The Prometheus League
Breaking News and Updates
- Abolition Of Work
- Ai
- Alt-right
- Alternative Medicine
- Antifa
- Artificial General Intelligence
- Artificial Intelligence
- Artificial Super Intelligence
- Ascension
- Astronomy
- Atheism
- Atheist
- Atlas Shrugged
- Automation
- Ayn Rand
- Bahamas
- Bankruptcy
- Basic Income Guarantee
- Big Tech
- Bitcoin
- Black Lives Matter
- Blackjack
- Boca Chica Texas
- Brexit
- Caribbean
- Casino
- Casino Affiliate
- Cbd Oil
- Censorship
- Cf
- Chess Engines
- Childfree
- Cloning
- Cloud Computing
- Conscious Evolution
- Corona Virus
- Cosmic Heaven
- Covid-19
- Cryonics
- Cryptocurrency
- Cyberpunk
- Darwinism
- Democrat
- Designer Babies
- DNA
- Donald Trump
- Eczema
- Elon Musk
- Entheogens
- Ethical Egoism
- Eugenic Concepts
- Eugenics
- Euthanasia
- Evolution
- Extropian
- Extropianism
- Extropy
- Fake News
- Federalism
- Federalist
- Fifth Amendment
- Financial Independence
- First Amendment
- Fiscal Freedom
- Food Supplements
- Fourth Amendment
- Free Speech
- Freedom
- Freedom of Speech
- Futurism
- Futurist
- Gambling
- Gene Medicine
- Genetic Engineering
- Genome
- Germ Warfare
- Golden Rule
- Government Oppression
- Hedonism
- High Seas
- History
- Hubble Telescope
- Human Genetic Engineering
- Human Genetics
- Human Immortality
- Human Longevity
- Illuminati
- Immortality
- Immortality Medicine
- Intentional Communities
- Jacinda Ardern
- Jitsi
- Jordan Peterson
- Las Vegas
- Liberal
- Libertarian
- Libertarianism
- Liberty
- Life Extension
- Macau
- Marie Byrd Land
- Mars
- Mars Colonization
- Mars Colony
- Memetics
- Micronations
- Mind Uploading
- Minerva Reefs
- Modern Satanism
- Moon Colonization
- Nanotech
- National Vanguard
- NATO
- Neo-eugenics
- Neurohacking
- Neurotechnology
- New Utopia
- New Zealand
- Nihilism
- Nootropics
- NSA
- Oceania
- Offshore
- Olympics
- Online Casino
- Online Gambling
- Pantheism
- Personal Empowerment
- Poker
- Political Correctness
- Politically Incorrect
- Polygamy
- Populism
- Post Human
- Post Humanism
- Posthuman
- Posthumanism
- Private Islands
- Progress
- Proud Boys
- Psoriasis
- Psychedelics
- Putin
- Quantum Computing
- Quantum Physics
- Rationalism
- Republican
- Resource Based Economy
- Robotics
- Rockall
- Ron Paul
- Roulette
- Russia
- Sealand
- Seasteading
- Second Amendment
- Seychelles
- Singularitarianism
- Singularity
- Socio-economic Collapse
- Space Exploration
- Space Station
- Space Travel
- Spacex
- Sports Betting
- Sportsbook
- Superintelligence
- Survivalism
- Talmud
- Technology
- Teilhard De Chardin
- Terraforming Mars
- The Singularity
- Tms
- Tor Browser
- Trance
- Transhuman
- Transhuman News
- Transhumanism
- Transhumanist
- Transtopian
- Transtopianism
- Ukraine
- Uncategorized
- Vaping
- Victimless Crimes
- Virtual Reality
- Wage Slavery
- War On Drugs
- Waveland
- Ww3
- Yahoo
- Zeitgeist Movement
- Prometheism
- Forbidden Fruit
- The Evolutionary Perspective
Category Archives: Cloud Computing
Ampere to Acquire OnSpecta to Accelerate AI Inference on Cloud-Native Applications – PRNewswire
Posted: July 29, 2021 at 8:59 pm
SANTA CLARA, Calif., July 28, 2021 /PRNewswire/ -- Ampere Computing today announced it has agreed to acquire AI technology startup OnSpecta, strengthening Ampere Altra performance with AI inference applications. The OnSpecta Deep Learning Software (DLS) AI optimization engine can deliver significant performance enhancements over commonly used CPU-based machine learning (ML) frameworks. The companies have already been collaborating and have demonstrated over 4x acceleration on Ampere-based instances running popular AI-inference workloads. The acquisition will include an optimized model zoo with object detection, video processing and recommendation engines. Terms were not disclosed and the acquisition is expected to close in August, subject to customary closing conditions.
"We are excited to welcome the talented OnSpecta team to Ampere," said Renee James, founder, chairman and CEO of Ampere Computing. "The addition of deep learning expertise will enable Ampere to deliver a more robust platform for inference task processing with lower power, higher performance and better predictability than ever. This acquisition underscores our commitment to delivering a truly differentiated cloud native computing platform for our customers in both cloud and edge deployments."
According to IDC Research, the AI server market is expected to be over $26B by 2024 with an annual growth rate of 13.7%. Ampere customers are seeking ways to manage the costs and the growing requirements for AI inference tasks in both centralized and edge-based infrastructure scenarios. DLS is a seamless binary drop-in library for many popular AI frameworks and will accelerate inference significantly on Ampere Altra. It enables the use of the Altra-native FP16 data format that can double performance over FP32 formats without significant accuracy loss or model retraining.
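The FP16 claim above is easy to illustrate in isolation: halving the width of each value halves the memory footprint while introducing only a small rounding error. A minimal NumPy sketch (purely illustrative; it does not represent OnSpecta's DLS or Ampere hardware):

```python
import numpy as np

# Hypothetical FP32 weights for a small layer (illustrative only).
rng = np.random.default_rng(0)
w32 = rng.standard_normal((1024, 1024)).astype(np.float32)

# Casting to FP16 halves the memory footprint...
w16 = w32.astype(np.float16)
print(w32.nbytes // w16.nbytes)  # 2

# ...at the cost of a small rounding error, usually tolerable
# for inference without retraining.
err = np.abs(w32 - w16.astype(np.float32)).max()
print(err < 1e-2)  # True
```

This is why a format switch alone can roughly double throughput on memory-bound inference workloads: each weight moves half as many bytes through the memory hierarchy.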
"This is a natural progression to the strong collaboration we already have with Ampere," said Indra Mohan, co-founder and CEO of OnSpecta. "Our team will greatly benefit from being a part of Ampere as we help build upon the great success of Ampere Altra and provide critical support to customers as they apply the Altra product family to a wide variety of AI inference use cases."
"We have already seen the powerful performance and ease-of-use of Ampere Altra and OnSpecta on the Oracle OCI Ampere A1 instance," said Clay Magouyrk, executive vice president of Oracle Cloud Infrastructure. "With DLS compatibility on all major open source AI frameworks including Tensorflow, PyTorch and ONNX, and the predictable performance of Ampere Altra, we expect to see continued innovation on OCI Ampere A1 shapes for AI inference workloads."
About Ampere
Ampere is designing the future of hyperscale cloud and edge computing with the world's first cloud native processor. Built for the cloud with a modern 64-bit Arm server-based architecture, Ampere gives customers the freedom to accelerate the delivery of all cloud computing applications. With industry-leading cloud performance, power efficiency and scalability, Ampere processors are tailored for the continued growth of cloud and edge computing. For more information, visit http://www.amperecomputing.com
About OnSpecta
OnSpecta's software, DLS, significantly accelerates AI inference workloads in the cloud and the edge. The company is headquartered in Redwood City, CA, and is led by its founders Indra Mohan (CEO), and Victor Jakubiuk (CTO). Its investors include Sage Hill Capital Partners, WestWave Capital, BMNT and GoingVC Partners.
Press Contact: Nicole Conley [emailprotected]
SOURCE Ampere
http://www.amperecomputing.com
Posted in Cloud Computing
National Education Equity Lab Launches Initiative to Prepare 10,000 Underserved Students for In-Demand Cloud Computing Careers – PRNewswire
Posted: at 8:59 pm
NEW YORK, July 28, 2021 /PRNewswire/ -- The National Education Equity Lab (Ed Equity Lab) today announced a new initiative with Amazon Web Services (AWS) designed to prepare more than 10,000 students in underserved high schools across the nation for careers in cloud computing by 2025. As part of Amazon's ongoing commitment (https://www.aboutamazon.com/news/workplace/amazon-to-help-29-million-people-around-the-world-grow-their-tech-skills-with-free-cloud-computing-skills-training-by-2025) to help 29 million people worldwide increase their technical skills by 2025, the new collaboration, launching this fall, will enable students in low-income school districts to access AWS cloud computing educational content and resources offered by Arizona State University (ASU) at no cost to students.
"Students from underserved school districts and communities face challenges that prevent them from pursuing and succeeding in some of the country's fastest-growing technical careers," said Wil Zemp, Director of Education to Workforce at AWS. "It will take intentional, proactive effort by employers, education leaders, and the tech industry to remove those barriers and build more equitable pathways to economic mobility."
For each of the last five years, cloud computing has been named one of the country's most in-demand skills by LinkedIn, and the U.S. Department of Labor predicts that increasing demand for cloud computing will be a primary driver of job growth across the IT sector in the coming decade. Ed Equity Lab will collaborate with AWS and ASU to provide high school students with knowledge and skills to move toward careers in cloud computing.
ASU and Ed Equity Lab will provide the high school students with the opportunity to earn college credit, which will be transferable to existing associate and bachelor's degree programs in cloud computing across the country. Students who successfully complete the rigorous courses will earn college credits from ASU and have the opportunity to earn an industry-recognized AWS Certification credential. The courses will be taught by ASU faculty, trained by AWS to help students become proficient in AWS technology.
"In an increasingly dynamic and global economy, higher education institutions have a responsibility to bridge the gap between K-12 schools and the workforceand foster the sort of experiences and opportunities that translate to success in higher education and throughout one's career," said Maria Anguiano, Executive VP of ASU's Learning Enterprise. "Together, we're working to break down the historic silos that so often create friction between high school, college, and the world of work."
The Ed Equity Lab currently operates in more than 100 Title I and Title I-eligible high schools across 35 cities to deliver college credit-bearing courses from top colleges and universities into teacher-led classrooms across the nation, at no cost to students. Since 2019, the Lab's pioneering model has served nearly 3,000 students, with thousands earning widely-transferable college credits for free and building the skills and confidence to succeed in college and beyond.
"Our work is rooted in the belief that talent is evenly distributed, but opportunity is not," said Leslie Cornfeld, Executive Director of The National Education Equity Lab. "This collaboration with AWS and ASU represents the next step in an ongoing effort to shift the status quo and enable students from all backgrounds to fulfill their aspirations and realize their unique potential."
About the National Education Equity Lab
The National Education Equity Lab is a nonprofit working with the Common App, Carnegie Corporation of New York, a Consortium of colleges and universities, and others to advance economic and social mobility opportunities for historically underserved students at scale.
In collaboration with under-resourced high schools nationwide, the National Education Equity Lab delivers and supports online, college credit-bearing courses from top colleges and universities into teacher-led high school classrooms (in-school or virtual), at no cost to students. Students can earn widely-transferable college credits, providing the opportunity to advance and demonstrate college readiness, and make college more affordable, accessible, and successful.
Because impact requires more than great content, the National Education Equity Lab offers a package of additional supports, including one-on-one college mentors, college-mindset videos and messages, and personal technology and hotspots so that access is not a barrier to participation. To learn more, visit EdEquityLab.org.
SOURCE The National Education Equity Lab
Cloud computing | Shaping Europe's digital future
Posted: July 21, 2021 at 12:44 am
The global data volume is growing very fast. Whereas cloud computing happens mostly in large data-centres today, by 2025 this trend will reverse: 80% of all data is expected to be processed in smart devices closer to the user, known as edge computing.
The availability of both edge and cloud computing is essential in a computing continuum to ensure that data is processed in the most efficient manner. Energy-efficient and trustworthy edge and cloud infrastructures will be fundamental for the sustainable use of edge and cloud computing technologies.
Cloud computing is a key objective to increase Europe's data sovereignty as outlined in the European Commission's Data Strategy, Digital Strategy, Industrial Strategy and the EU recovery plan.
Currently, the European Commission is working on the establishment of a European Alliance on Industrial Data, Edge and Cloud, which will enable the development of several work streams:
EU countries have signed a joint declaration on cloud where they expressed their will to collaborate towards the creation of a European cloud.
Other initiatives related to cloud computing are:
In parallel, cloud computing and edge computing will be among those digital technologies that will contribute to achieving the sustainability goals of the European Green Deal in areas such as farming, mobility, buildings and manufacturing.
The European Union also supports the development of cloud computing in Europe with research and innovation actions under the Horizon 2020 programme.
EU-funded projects will work on novel solutions for federating cloud infrastructures. New cloud-based services will have to respond to high-standard requirements with regard to data protection, performance, resilience and energy-efficiency. The services and infrastructures will have to meet the future digitisation needs of industry and the public sector. Addressing these challenges will also be part of and contribute to the technological ambitions of the Next Generation Internet (NGI).
In addition, the EU intends to invest €2bn via the European Data Strategy in a European High Impact Project that will federate energy-efficient and trustworthy cloud infrastructures and related services. Cloud technologies that have been developed within Horizon 2020-funded research and by market actors will be deployed via the Connecting Europe Facility 2 (for cloud infrastructures interconnection) and Digital Europe (for cloud-to-edge services and cloud marketplaces) Programme.
The European Cloud Initiative of 2016 presented a strategy for public investments to build:
The European Cloud Initiative builds on the results achieved under the2012 European Cloud Strategy.
Banks now rely on a few cloud computing giants. That’s creating some unexpected new risks – ZDNet
Posted: at 12:44 am
Outsourcing key banking data and services to a small number of cloud service providers means that those providers have the power to dictate their own terms.
Banks' growing reliance on cloud computing could pose a risk to financial stability and will require stricter oversight, according to top executives from the UK's central bank.
In a report focusing on financial stability in the UK over the past few months, the Bank of England drew attention to the increasing adoption of public cloud services, and voiced concerns about those services being provided by only a handful of huge companies that dominate the market.
Outsourcing key banking data and services to a small number of cloud service providers (CSPs), said the Bank of England, means that those providers have the power to dictate their own terms, potentially at the expense of the stability of the financial system.
For example, cloud providers might fail to open up the inner workings of their systems to third-party scrutiny, meaning that it is impossible for customers to know if they are ensuring the level of resilience that is necessary to carry out banking operations.
"As regulators and people concerned with financial stability, as (CSPs) become more integral to the system, we have to get more assurance that they are meeting the level of resilience that we need," Andrew Bailey, the Bank of England governor, told reporters in a press conference.
In the past years, financial institutions have accelerated their plans to scale up their reliance on CSPs. From file sharing and collaboration to fraud detection, through business management and communications, banks have used cloud outsourcing both to run software and access additional processing capacity, and to support IT infrastructure.
Until recently, cloud services were used mostly to run applications at the periphery of banking operations, such as HR systems, with no direct impact on financial services. According to the Bank of England, however, this is now changing, with CSPs being called in to process operations that are more integral to the core running of banks.
"We've crossed a further threshold in terms of what sort of systems and what volumes of systems and data are being outsourced to the cloud," said Sam Woods, the chief executive officer of the Prudential Regulation Authority (PRA). "As you'd expect, we track that quite closely."
Last year, the Bank of England opened bidding for a cloud build partner, with the goal of creating a fit-for-purpose cloud environment that could better support operations in a digital-first environment. At the time, the institution said that it had already been in talks with Microsoft's Azure, Google Cloud and Amazon's AWS, and that it would likely be targeting Azure in a first instance. The possibility of adopting a multi-cloud strategy was also raised.
There are many benefits to moving financial services to the public cloud. For example, while using old-fashioned, on-premises data centers incurs extra expenses, a recent analysis by the Bank of England estimated that adopting the ready-made services offered by hyperscalers could reduce technology infrastructure costs by up to 50%.
Another advantage of public cloud services is that they are more resilient. The sheer scale of CSPs enables them to implement infrastructure that integrates multiple levels of redundancy, and as such, is less vulnerable to failures.
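The redundancy argument above can be made concrete with basic probability: if each of n independent replicas is up with probability a, then at least one is up with probability 1 - (1 - a)^n. A small sketch with hypothetical availability figures (not published CSP numbers):

```python
def combined_availability(a: float, n: int) -> float:
    """Availability of n independent redundant replicas,
    each individually available with probability a."""
    return 1 - (1 - a) ** n

# A single 99%-available server vs. three redundant copies
# (illustrative figures only).
print(round(combined_availability(0.99, 1), 6))  # 0.99
print(round(combined_availability(0.99, 3), 6))  # 0.999999
```

Three nines of redundancy turn roughly 3.65 days of expected annual downtime into about half a minute, which is the scale advantage the hyperscalers' multi-level redundancy exploits.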
Moving to the cloud, therefore, is not intrinsically detrimental to banking services; quite the contrary. But the main sticking point, according to the regulators, lies in the concentration of major players that dominate the cloud market. According to tech analysis firm Gartner's latest numbers, the top five cloud providers currently account for 80% of the market, with Amazon holding a 41% share and Azure representing nearly 20% of the market.
"As of course a market becomes more concentrated around one supplier or a small number of suppliers, those suppliers can exercise market power around of course the cost but also the terms," said Bailey.
"That is where we do have a concern and do have to look carefully because that concentrated power on terms can manifest itself in the form of secrecy, opacity, not providing customers with the information they need in order to be able to monitor the risk in the service. And we have seen some of that going on."
As Bailey stressed, part of the reason for CSPs to remain secretive comes down to better protecting customers, by not opening up key information to potential hackers. But the regulator said that a careful balance has to be maintained on transparency, to enable an appropriate understanding of the risks and resilience of the system without compromising cybersecurity.
Leighton James, CTO of UKCloud, which provides multi-cloud solutions to public sector organizations across the country, explained that these issues are not unprecedented, and it is unsurprising to see them trickle down to financial services.
"We're anxious about cloud providers becoming so big that the terms and conditions are pretty much 'take it or leave it'. We've definitely seen that happening already in the public sector, and we can definitely see it happening in the financial services sector if we are not careful," James told ZDNet.
According to James, part of the risk stems from traditional banks attempting to compete against new disruptive players in the sector. Financial institutions are now rushing to overhaul their legacy infrastructure and catch up with the digital-native customer experiences that were born in the cloud and are now widely available thanks to fintech companies.
"It's clearly imperative for the financial sector to modernize and adopt digital technologies," said James. "The question becomes how best they can do that by balancing the risk of digital transformation."
And in this scenario, the risks of placing all of banks' eggs in a handful of CSP's baskets is too high, argued James.
The Bank of England has similarly urged financial institutions to exert caution when developing their digital transformation strategies, and is currently in talks with various regulators to discuss how to best tackle those risks.
With cloud concerns widelyshared by other nations, especially in the EU, those discussions are likely to become international, and the UK's central bank predicts that global standards will be created to develop a consistent approach to the issue.
How is cloud computing revolutionising healthcare? – Healthcare Global – Healthcare News, Magazine and Website
Posted: at 12:44 am
Cloud computing has become the talk of the town, especially within the healthcare niche. The adoption of this state-of-the-art tech innovation has been escalating at a frenetic pace. One recent research study suggests that the global market for cloud technologies in healthcare is projected to reach $64.7 billion by 2025.
The reason behind its recent exponential growth is simple though. If healthcare businesses were simply service providers before, today they're true progressive institutions that depend on their IT infrastructure and departments to gain better clinical, administrative, and financial insights. This helps them make informed decisions.
And that's not all - as patient expectations change with each passing day, and new payment models get added to the equation, cloud technology has become vital to drive efficiency and improve patient care.
There are several things that have been made possible in healthcare due to the rapid adoption of cloud technology.
Most cloud platforms offer better infrastructure and services than individual on-premise storage systems set up by healthcare facilities.
Renting out rack space in a data centre would cost you only a fraction of what it would to set up and maintain an in-house system at such a scale. Additionally, there are substantial savings on technical upgrades, staff, and licenses.
On-premise data centres not only necessitate an investment in hardware early on, but they also come with ongoing costs of managing physical servers, spaces, and cooling solutions among other things. "While EHRs have become mainstream in healthcare, storage of data on cloud servers is set to become the new normal," explains Dr Vinati Kamani in one of her recent articles. "The use of cloud computing in healthcare saves up on the additional server costs, wherein you only pay for the computing capacity you use while ensuring the safety of sensitive PHI at the same time," she continues.
Therefore, by carefully choosing a cloud hosting platform that will fit the needs of their particular practice, healthcare leaders can easily lower the costs associated with data storage and concentrate both their efforts as well as budget on making the patient experience seamless.
Cyber attacks and thefts have been on the rise in the healthcare space of late. Now is the time that practices and hospitals alike need augmented security protocols that safeguard sensitive patient data. Healthcare leaders are swiftly moving toward hybrid cloud environments, which offer the benefit of both private and public cloud to achieve optimum compliance, security, flexibility and the ease to move applications between the two.
In a press release issued by Nutanix, the CIO of the Anne Arundel Medical Center, Dave Lehr, said: "As a healthcare organisation, we're responsible for managing critical clinical and IT applications such as EHR and PACS, as well as making sure we have an infrastructure that is secure and scalable to support changing needs such as hybrid cloud-based disaster recovery."
"We knew that the right hyperconverged infrastructure would allow us to manage these workloads on a single, cost-effective solution," Lehr continues.
A number of cloud vendors now also offer compliance with the Health Insurance Portability and Accountability Act (HIPAA).
Opting for a compliant cloud service can further ensure that the sensitive patient data within your systems remains protected and adheres to HIPAA rules at all times. This can help you avoid any hefty penalties and keep your facility's reputation from getting tarnished.
The rapid adoption of collaboration tools like video conferencing and enterprise messaging since the COVID-19 public health emergency hit us last year has presented immense potential towards positively influencing healthcare teams and leadership.
The cloud-based software behind these applications helps ameliorate the clinical workflow and enhances patient care, irrespective of the provider's or patient's physical location.
Today, with the developments happening on the cloud technology front, the data collected from remote patient monitoring devices can also be uploaded to the healthcare facility's dedicated cloud server or the user's private centralised cloud. The platform then keeps a record of all the monitored data, which can be retrieved for analysis by the medical personnel during treatment.
The utilisation of cloud storage for storing data from electronic health record systems (EHRs) has helped revolutionise collective patient care, making it less complicated for care providers and their staff to retrieve patient details at any given point in time, even from a remote location.
The majority of cloud platforms also employ essential security features, such as multi-factor authentication (MFA) and access controls, which can provide patients with a greater sense of security when it comes to sharing credit card details or social security numbers.
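MFA of the kind mentioned here is commonly implemented with time-based one-time passwords (TOTP, RFC 6238). A stdlib-only sketch of the core algorithm, checked against the RFC's published SHA-1 test vector (a simplified illustration, not any particular vendor's implementation):

```python
import base64
import hashlib
import hmac
import struct
import time

def totp(secret_b32, t=None, digits=6, step=30):
    """Time-based one-time password (RFC 6238) over HMAC-SHA1."""
    key = base64.b32decode(secret_b32)
    counter = int((time.time() if t is None else t) // step)
    mac = hmac.new(key, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = mac[-1] & 0x0F                           # dynamic truncation
    code = struct.unpack(">I", mac[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

# RFC 6238's SHA-1 test secret is the ASCII string "12345678901234567890".
secret = base64.b32encode(b"12345678901234567890").decode()
print(totp(secret, t=59, digits=8))  # 94287082 (RFC 6238 test vector)
```

Because the code depends only on a shared secret and the current time window, a phished password alone is not enough to log in, which is the property that makes MFA valuable for patient portals.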
Web-based software also makes it easier for physicians, staff members and patients to access patient portals and employ mobile health applications to receive important health information, such as lab test results, medication reminders and activity trackers.
All in all, cloud computing has presented us with an unprecedented opportunity to make value-based, patient-centric healthcare a reality.
The advantages mentioned above only scratch the surface of cloud technology's true potential. Only those forward-looking healthcare leaders who are ready to embrace this technology will know how much more it has in store for healthcare.
What Data Governance Means to Cloud Computing (And Vice Versa) – insideBIGDATA
Posted: at 12:44 am
The symbiosis between data governance and cloud computing is apparent to any organization with significant cloud investments. The cloud has a plethora of resources that enhance data governance, which is critical for regulatory compliance, risk mitigation, and long-term profitability of data assets.
Simultaneously, however, data governance hallmarks of metadata management, data cataloging, and data stewardship are requisite for maximizing the cloud's utility by illustrating where data are and employing them for singular use cases like customer 360s, real-time product or service recommendations, and advanced analytics.
Although it's difficult to say which of these capabilities is more advantageous to the enterprise, they clearly complement each other.
For data governance, the cloud's chief value proposition is that "from a standpoint of connectivity, or data integration, or data quality, or cataloging, everything's included so you have an all-in-one solution that's possible because we're in a microservices world where you can deploy efficiently and have things kind of seamlessly interoperable," Informatica MDM General Manager Manouj Tahiliani revealed.
The cloud is also a significant contributor to dispersing resources across locations which, if left unchecked, could easily result in silos and ungoverned use. According to Profisee CTO Eric Melcher, the cloud creates "the much more distributed enterprise where you've got lots of applications: our HR applications, CRM applications, ERP applications, our customer experience platform. We've got applications across the line. And by the way, the data's not in that server in the corner, it's now all over the place. So, I need governance to understand where everything is."
Data governance is essential for keeping cloud deployments orderly, while the cloud enriches the very means of effecting governance by allowing organizations to position resources wherever they're most advantageous for modern computing, particularly when supported by governance staples of metadata management, data cataloguing, data modeling, and data stewardship.
Metadata Intelligence
Metadata is utilitarian for governing data in cloud settings. It's particularly beneficial for assembling data across numerous sources to use in a single domain such as customer, product, supply chain information, and more. Tahiliani referred to the notion of "metadata intelligence" as foundational to implementing data quality and data integration. Coupling metadata intelligence with Master Data Management exploits the sundry cloud sources by enabling organizations "to govern how they get data from different sources, curate it, and share it across the enterprise," Tahiliani observed.
Metadata plays a couple of different roles in this basic functionality. It's the key to connecting data to various downstream systems through MDM's logical model. Firms can also "use the power of metadata to help them discover elements that need to be mastered, rapidly provide integration between the system using that metadata knowledge and share that data, as well as use metadata to use machine learning capabilities around matching and data quality rules," Tahiliani commented. These capabilities are instrumental for combining sources for customer insight, for example, in a well-governed fashion.
Data Cataloging
Metadata also helps the data cataloging process Tahiliani described, which identifies what data are where in cloud settings. In fact, there are several cloud tools that reinforce this aspect of data governance by empowering organizations so "the first thing they do is catalog what they have," Melcher noted. "Now that we've cataloged it, let's start classifying and understanding what it is and figuring out what's sensitive and who knows about the data here and there." Once data are classified, it's relatively straightforward to tag them and ascribe ownership to them, which provides the framework for suitably governing data across any variety of cloud sources.
Credible catalogs can point at sources to provide this functionality or allow sources to input their metadata for cataloging purposes. The latter is particularly effectual with MDM because it lets organizations "push the logical definitions of Master Data and register assets to what the end user would actually expect," Melcher mentioned. The benefit of this approach is that data already adheres to MDM conventions of data quality, definitions, and other aspects of its logical model, which aids in creating catalogs and classifications that enhance enterprise use cases while ensuring those use cases are well governed.
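The catalog-then-classify workflow Melcher describes can be caricatured as a rule-based pass over column metadata. A toy sketch (column names and patterns are hypothetical; commercial catalogs use far richer, often ML-driven classifiers):

```python
import re

# Hypothetical column names harvested from a few cloud sources.
columns = ["customer_email", "order_total", "ssn", "ship_date", "cc_number"]

# Simple patterns that flag personally identifiable information.
SENSITIVE_PATTERNS = [r"email", r"\bssn\b", r"cc_?number", r"phone"]

def classify(name: str) -> str:
    """Tag a column 'sensitive' if any PII pattern matches, else 'general'."""
    if any(re.search(p, name) for p in SENSITIVE_PATTERNS):
        return "sensitive"
    return "general"

# The resulting catalog maps each column to its classification tag.
catalog = {c: classify(c) for c in columns}
print(catalog)
```

Once every column carries a tag like this, attaching owners and access policies to the "sensitive" subset becomes a straightforward lookup rather than a manual audit.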
Data Modeling
MDM is also useful for unifying the different data models that abound throughout cloud sources so organizations can combine them for analytics, for example. Data modeling is a pivotal aspect of data governance that can be particularly time-consuming (delaying time to value) with traditional methods. MDM, however, lets organizations create models for products or any other domain partly by using data discovery mechanisms. "As you're creating the models and what attributes you need to have, the discovery aspect allows you to understand across the source systems for those different entities what's the metadata that's being held," Tahiliani explained. "That provides information about how often it's used and updated to really determine if that should be an attribute within your master data model."
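The discovery step Tahiliani describes, checking which attributes recur across source systems before admitting them to the master model, can be approximated in a few lines: count how often each attribute appears across source schemas and keep those that recur. The source names and attributes below are hypothetical:

```python
from collections import Counter

def discover_attributes(source_schemas, min_sources=2):
    """Scan the metadata of several source systems and count how often
    each attribute appears; attributes common to enough sources become
    candidates for the master data model."""
    counts = Counter()
    for schema in source_schemas.values():
        counts.update(schema)
    return {attr for attr, n in counts.items() if n >= min_sources}

# Customer metadata pulled from three illustrative cloud sources:
sources = {
    "crm":       {"customer_id", "email", "phone", "loyalty_tier"},
    "billing":   {"customer_id", "email", "billing_address"},
    "ecommerce": {"customer_id", "email", "phone", "last_order_date"},
}

candidates = discover_attributes(sources, min_sources=2)
print(sorted(candidates))  # ['customer_id', 'email', 'phone']
```

A real discovery pass would also weigh how often each attribute is used and updated, as Tahiliani notes, rather than mere presence in a schema.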
The most expressive data models unify terminology and include specific vocabularies to describe customers or supply chain needs, for instance. The metadata intelligence Tahiliani alluded to assists this facet of data modeling while allowing organizations to redress disparities in how entities are represented "when you know which source systems the metadata you're pulling in is from, and what the vocabulary is across those source systems," Tahiliani added.
Data Stewardship
All of the above dimensions of data governance (metadata intelligence, data cataloging, and data modeling) are invaluable to data stewards attempting to ensure governance standards are met across the enterprise in heterogeneous cloud, edge, and on-premises applications. The enhanced data modeling traits pertaining to definitions and attributes found in MDM (including information from data cataloging tools) are critical to data stewards validating data governance protocols.
For example, if a steward needs at-a-glance information about the data model for a customer entity, he'll see "this is a description of a customer associated with this glossary term called business partner, or whatever the case may be, and Bob in Sales is an expert on what customer is," Melcher indicated. With this approach, "MDM participates in governance by pushing metadata into a catalog, but then, as governance occurs, the output of governance is then available for end users to consume back in [MDM]," Melcher concluded.
Master Data Governance
Data governance is indispensable for making good on the cloud's promise of real-time access to a distributed data landscape with elastic scalability in a pay-per-use pricing model. Additionally, the cloud contains many tools (such as cloud-native MDM solutions, data cataloging instruments, and others) that streamline governance capabilities to decrease data's risk while boosting its enterprise value over the long term.
MDM sits between each of these constructs as a viable means of balancing the usefulness of one with the other. "You shouldn't do governance in MDM," Melcher cautioned. "You should do governance and MDM. You should be governing across more than just your Master Data Management platform."
About the Author
Jelani Harper is an editorial consultant servicing the information technology market. He specializes in data-driven applications focused on semantic technologies, data governance and analytics.
More:
What Data Governance Means to Cloud Computing (And Vice Versa) - insideBIGDATA
Posted in Cloud Computing
Comments Off on What Data Governance Means to Cloud Computing (And Vice Versa) – insideBIGDATA
After JEDI cancellation, military expands competition for cloud computing that can scale to defense needs – Intelligent Aerospace
Posted: at 12:44 am
WASHINGTON - Now that U.S. Department of Defense (DOD) officials have canceled their signature $10 billion enterprise cloud computing contract, what's next for the military's cloud needs? Patrick Tucker writes for Defense One. Continue reading the original article.
The Intelligent Aerospace take:
July 19, 2021 - DOD officials say that the solution will look a lot like a marriage between what's being offered by Microsoft's Azure and Amazon Web Services. It won't be the single massive cloud envisioned as the Joint Enterprise Defense Infrastructure, affectionately called JEDI.
Scale is the operative word. DOD officials have made clear that they're not going back to the old days of many clouds from tiny vendors with no central cloud environment for data access and distribution. A cloud on the scale of what Amazon, Microsoft, or Google can provide is essential to realize the Pentagon's dream of joint all-domain command and control, or JADC2.
Today, the pathway looks like a joint Amazon and Microsoft cloud, the two largest companies that were fighting over the JEDI contract. Amazon and Microsoft likely will receive contracts under a new program called the Joint Warfighter Cloud Capability. But they wouldn't be the only ones.
Related: Civilian and military aircraft get avionics upgrades
Related: The essentials of trusted computing and cyber security
Related: Lockheed Martin flies mission-enabling Kubernetes onboard U-2
Jamie Whitney, Associate Editor, Intelligent Aerospace
Data Theorem Showcasing Product of the Year with Hacker Toolkits in Advanced Security Training Session During Black Hat USA 2021 – Business Wire
PALO ALTO, Calif.--(BUSINESS WIRE)--Data Theorem, Inc., a leading provider of modern application security, today announced that TMC, a global, integrated media company, has awarded Data Theorem with a 2021 Product of the Year Award, presented by Cloud Computing Magazine. The company also announced that its award-winning AppSec solutions will be showcased in a free advanced security training session during Black Hat USA 2021.
Data Theorem's AppSec experts will host the training session, titled "GraphQL DDoS and OTP Bypass Attacks," on Aug. 4 at 11 am PDT / 2 pm EDT. For more information and to register, see: https://resource.datatheorem.com/graphql
The 2021 Product of the Year Award, presented earlier by Cloud Computing Magazine, adds to Data Theorem's decorated list of distinctions; the company is recognized for its unique dynamic and run-time analysis with offensive attack surface management and defensive protection toolkits. Its solutions are differentiated in enabling organizations to conduct continuous, automated security inspection and remediation of their most important cloud-native applications.
"Congratulations to Data Theorem for being honored with a Cloud Computing Product of the Year Award," said Rich Tehrani, CEO, TMC. "Data Theorem's AppSec solutions are truly innovative products, and are among the best solutions available within the past twelve months that facilitate business-transforming cloud computing and communications. I look forward to continued excellence from Data Theorem."
Data Theorem's latest offering, Cloud Secure, is the industry's first solution delivering attack surface management security for cloud-native applications that starts at the client layer (mobile and web), protects the network layer (REST and GraphQL APIs), and extends down through the underlying infrastructure (cloud services). Its combination of attack surface management and defensive protections enables both offensive and defensive security capabilities to best prevent data breaches of cloud-native applications and serverless cloud functions.
"It is rewarding to accept yet another award for Data Theorem, this time being named a 2021 Product of the Year for our unique design to help customers secure their modern cloud applications," said Doug Dooley, Data Theorem COO. "Cloud Secure joins Data Theorem's portfolio of award-winning AppSec solutions. Our hacker toolkits continue to showcase cloud-native application attack insights, which we will demonstrate in our upcoming GraphQL DDoS and OTP Bypass Attacks session during Black Hat. We are pleased to offer this free advanced security training opportunity to help educate the security research community on some of today's modern AppSec attacks and prevention techniques."
Data Theorem's broad AppSec portfolio protects organizations from data breaches with application security testing and protection for modern web frameworks, API-driven microservices, and cloud resources. Its solutions are powered by its award-winning Analyzer Engine, which leverages a new type of dynamic and run-time analysis that is fully integrated into the CI/CD process and enables organizations to conduct continuous, automated security inspection and remediation.
About Cloud Computing Magazine
Cloud Computing magazine is the industry's definitive source for all things cloud, from public, community, hybrid, and private cloud to security and business continuity, and everything in between. This quarterly magazine, published by TMC, assesses the most important developments in cloud computing not only as they relate to IT, but to the business landscape as a whole.
About Data Theorem
Data Theorem is a leading provider of modern application security. Its core mission is to analyze and secure any modern application anytime, anywhere. The award-winning Data Theorem Analyzer Engine continuously analyzes APIs, Web, Mobile, and Cloud applications in search of security flaws and data privacy gaps. Data Theorem products help organizations prevent AppSec data breaches. The company has detected more than 1 billion application eavesdropping incidents and currently secures more than 8,000 modern applications for its enterprise customers around the world. Data Theorem is headquartered in Palo Alto, Calif., with offices in New York and Paris. For more information visit http://www.datatheorem.com.
Data Theorem and TrustKit are trademarks of Data Theorem, Inc. All other trademarks are the property of their respective owners.
The 10 Biggest Cloud Outages Of 2021 (So Far) – CRN
Verizon, Microsoft, and Google were just some of the cloud providers to see their services interrupted so far this year by a variety of issues, from a change in an authentication system to a deadly winter storm. In the cloud computing era, some experts say we can only expect more outages, but with less severity.
Miles Ward, chief technology officer at Los Angeles-based Google partner SADA Systems, told CRN that cloud outages can prove less disastrous than when data centers have issues. With cloud-related issues, providers can fix the problem in parallel with a users team, whereas data centers can require an internal team to fix problems.
"Outages can mean the end for companies, depending on their choices in design and deployment, or they can be complete non-events," Ward said. "Cloud has changed the nature of outages."
As cloud adoption and the number of regions, zones and cloud services grow, everyone should prepare for more outages, Ward said. But he expects the type of global, all-service outages that garner headlines to decrease.
"Every cloud engineering team has seen how impossible it is for customers to engineer around these kinds of outages and is working hard to distribute, subdivide, and make fault-tolerant these central services," Ward said. "The result may be a shift of focus where you might see even more minor failures in singleton services, while the global services survive seemingly unaffected by minor failures because of this investment in resilience."
Companies today need copies of their data in distant regions, instances running in multiple zones, and automation to cut down on the time it takes to fix an outage, Ward said. At SADA, even demos are designed with high availability to run across Google Cloud and AWS.
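Ward's prescription (replicas in distant regions plus automation around failure) can be sketched in miniature. Here `fetch` stands in for whichever storage client a provider's SDK supplies, and the region names and data are made up:

```python
def read_with_failover(key, regions, fetch):
    """Try each region's replica in order and return the first
    successful read; escalate only if every region fails."""
    last_error = None
    for region in regions:
        try:
            return fetch(region, key)
        except ConnectionError as err:
            last_error = err          # region is down; try the next replica
    raise RuntimeError(f"all regions failed for {key!r}") from last_error

# Simulate an outage in the primary region:
replicas = {"us-east": None, "us-west": {"doc-1": "payload"}}

def fake_fetch(region, key):
    store = replicas[region]
    if store is None:
        raise ConnectionError(f"{region} is unavailable")
    return store[key]

print(read_with_failover("doc-1", ["us-east", "us-west"], fake_fetch))  # payload
```

The point of the sketch is the design stance, not the code: when copies already exist in a distant region, an outage degrades into a routed-around failure instead of downtime.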
In the meantime, CRN has collected a list of some of the largest cloud outages and issues so far this year. Here's what you need to know.
For more of the biggest startups, products and news stories of 2021 so far, click here.
Army Engineers, Microsoft to Analyze Extreme Weather Risk Using Cloud-based Analytics – HPCwire
VICKSBURG, Miss., July 20, 2021 - Modeling the risk of extreme weather and natural disasters along the nation's coastline is critical to the U.S. Army Engineer Research and Development Center (ERDC) mission of delivering innovative solutions for a safer, better world.
Increasing this modeling capacity and improving the dissemination of data for climate research are the goals of a new agreement between the ERDC and Microsoft Corporation. This government/industry collaboration is aimed at improving climate modeling and natural disaster resilience planning through the use of predictive-analytics-powered, cloud-based tools and Artificial Intelligence (AI) services.
The agreement seeks to demonstrate the scalability of the code of ERDC's premier coastal storm modeling system, CSTORM-MS, inside Microsoft's Azure Government, a cloud computing service for building, testing, deploying, and managing applications and services through Microsoft-managed data centers specifically for the U.S. Government. CSTORM-MS is a comprehensive, integrated system of highly skilled and highly resolved models used to simulate coastal storms. The models provide a robust, standardized approach to establishing the risk coastal communities face from future storm events and to evaluating flood risk reduction measures. With its physics-based modeling capabilities, CSTORM-MS integrates a suite of high-fidelity storm modeling tools to support a wide range of coastal engineering needs for simulating tropical and extra-tropical storms, as well as wind, wave, and water levels.
Currently, CSTORM-MS models are run at ERDC's Department of Defense Supercomputing Resource Center, one of the DoD High Performance Computing Modernization Program's (HPCMP) supercomputing centers. In 2020, ERDC and the HPCMP performed a workload assessment of commercial cloud for high-performance computing. This initial testing included a feasibility study of the CSTORM-MS models and was successfully conducted using Microsoft's Azure cloud.
Through the Cooperative Research and Development Agreement (CRADA) between ERDC and Microsoft, two goals have been set for this second phase of the project:
Microsoft's participation in this effort stems from Microsoft AI for Earth, a working group established within Microsoft in June 2017 that provides cloud-based tools and AI services to organizations working to protect the planet across five key areas: agriculture, biodiversity, conservation, climate change, and water. AI for Earth awards grants to support projects that use AI to change the way people and organizations monitor, model, and manage Earth's natural systems.
The CRADA between ERDC and Microsoft is made possible through the Federal Technology Transfer Act of 1986. The act provides that federal laboratories' developments, such as those of ERDC, should be made accessible to private industry and state and local governments for the purpose of improving the economic, environmental, and social well-being of the United States by stimulating the use of federally funded technology developments or capabilities.
Source: ERDC
Read more:
Army Engineers, Microsoft to Analyze Extreme Weather Risk Using Cloud-based Analytics - HPCwire