TIBCO Software Is About To Gamify Analytics – CDOTrends

The health pandemic showed how much data science can help businesses and the glaring gaps that currently stop them from doing so.

It is not a new problem. Before the pandemic, C-level executives grumbled over data science teams' value in delivering actionable insights to businesses. Simultaneously, data scientists complained about data pipelines, DataOps difficulties, and integration with open source. The pandemic only made these shortfalls obvious.

TIBCO Software is looking to change this. It has made a slew of announcements that tailor data analytics to different business personas and bring data science a step closer, quite literally, to playing an online game.

Partners Level Up

One key message that rang throughout TIBCO NOW, the company's annual event now gone virtual, is the partners' role.

"You'll see a story where we'll talk more and more about better together, about how our partners are becoming an immensely important part of TIBCO, especially in APJ. And also, how we're working with our partners, not just with our customers, to drive innovation," said Robert Merlicek, chief technology officer for APJ at TIBCO Software.

The partner program is now getting a significant upgrade. The reason: TIBCO Software sees partners as a competitive differentiator. "One of the things we've realized is that partners are more important as part of the ecosystem; they bring, not only capabilities but IP and domain knowledge, etc. So, you'll see these initiatives that have been set up," said Merlicek.

For example, Nick Lim, general manager for APJ at TIBCO Software, noted that his company is pushing its TIBCO Labs Partner Program. Previously, the Labs program involved key clients around the development of unique solutions for the business. Now, partners can join this program to develop unique value propositions. For CDOs and data science teams, it means that they can leverage key domain-specific or technology-specific solutions (e.g., blockchain in trade finance).

Merlicek describes this effort as bringing partners along "that journey with us around innovation."

Platforming on Spotfire

TIBCO is also becoming a platform, anchored by the new release of TIBCO Spotfire 11. By transforming its most valuable product into a platform, the company hopes to drive its so-called Hyperconverged Analytics experience.

The reasons are practical. CDOs and data science teams know that their companies are drowning in data. Piecing together a complete picture across the different pockets of data stores in an organization is a common complaint.

Different data stores also create data gravity problems, where colossal data sets require data analytics tools to be deployed closer to them to overcome lag. This becomes an additional issue as companies start to move to edge computing and IoT analytics to finetune their business agility and dive into machine learning.

Even with this data, companies still struggle to unlock insights within their data sets. The pandemic is making this urgent as companies see innovation as their only means of survival. Good data management also allows them to deploy machine learning to do more with less and be proactive.

"And Spotfire version 11 has brought new capability to the vast majority of people to make it easier to get insights to leverage machine learning," said Merlicek.

At TIBCO NOW, the analytics giant announced its Hyperconverged Analytics vision, which it sees as shortening the time to insights. Spotfire 11 is positioned as the platform to shorten the time to custom analytics creation. Together, they create an environment that melds data management and data science with real-time data "to drive business insights, decisions, and actions," the company said in a press release.

The platform ecosystem

Spotfire is not alone. Three pivotal announcements aim to enhance the platform's use, namely TIBCO Cloud Data Streams, TIBCO Any Data Hub, and TIBCO Spotfire Mods.

TIBCO Software claimed that its Cloud Data Streams "is one of the first cloud-based, real-time streaming BI platforms, delivering click-simple access to real-time data for visualizations in Spotfire." With it, CDOs and data science teams can easily detect emerging trends and patterns in real time while staying proactive about future conditions, a key ask during pandemic times as companies navigate a rapidly changing landscape.

TIBCO Any Data Hub offers a data management blueprint for distributed data environments. The goals are accuracy and consistency, which can improve trust and better align data science output for business needs. By integrating with TIBCO Cloud Integration, TIBCO Cloud Mashery, TIBCO Cloud Metadata Management, and TIBCO EBX, it provides an all-in-one approach for data definitions. For CDOs and data science teams, it means "one data catalog, one point of access, one security model, and one governance system," said the press release.

Spotfire Mods, which takes a page from the mods made famous by games like Civilization, builds on TIBCO Software's persona-based approach to analytics. The offering allows the development of purpose-built analytic apps that cater to different business users or industries.

This is an important step forward, as apps can include both cloud-native and legacy components, allowing different personas to gain insights irrespective of the platform while accelerating reuse.

"I think this is the evolution of the analytical demand that we're seeing for a lot of customers. So, the notion is, I don't want to be reinventing the wheel; I don't want to be building custom base analytics capabilities on top of my platform and rebuilding those all the time and [creating] different versions. So, it's very much like the version of the app store," said Merlicek.

Multiplayer options

Another key challenge lies in open source integration. CDOs and data scientists continue to grapple with integrating new features or tools while keeping platform complexity to a minimum.

"We have seen the vast rise in [Apache] Kafka, a lot of big data repositories around Hadoop, and we saw various open source technologies across a lot of different segments. Now, if you put your customer hat on, I've got information spread across everywhere. Now, it's important that you get access to those silos of data. And so what you'll see from a TIBCO perspective is that there's a vast number of open-end connectivity options in our analytics," said Merlicek.

During TIBCO NOW, the company announced TIBCO Data Virtualization 8.3 that supports a significantly expanded set of data adapters. Data science teams can connect to over 300 data sources, including support for streaming sources such as Apache Kafka, OSIsoft, and OPC Unified Architecture. It is also available on the Microsoft Azure Marketplace and AWS Marketplace and has a Dockerfile option.

Future-proofing

TIBCO Software's moves come as the global industry looks for ways to embed analytics at all decision-making levels, from the front office to the boardroom. Adding streaming IoT data to the mix can help companies become more responsive, but it also adds more complexity for data science teams.

Taking a platform approach, upgrading its partner ecosystem's breadth and depth, and embracing connectors and open source software make sense. Together, they allow data science teams to go beyond the science and start doing more with the data. The days of bragging about bigger data science teams or fancy visual reports are long gone.

The pandemic has also shifted how the C-suite engages with data. Dashboards and KPI monitoring fell short when global supply chains saw demand plummet, customer needs shift, and digitalization move to an organization's front and center. As data-driven executives want more out of their data, TIBCO Software is looking to plug the analytics gap between data and decision sciences.

Will customers bite on TIBCO Software's new play? It is too early to tell, but the data analytics giant says game on.

Image credit: iStockphoto/gorodenkoff

See original here:
TIBCO Software Is About To Gamify Analytics - CDOTrends

Will technology save humanity? Researchers gather at global summit on bleeding-edge tech for good – SiliconANGLE News

Brilliant minds have shaped the course of human history. From the astrolabe to the internet, innovation has been a defining trait of our species. Now, with the Western world on the edge of what, at times, seems like an apocalyptic future, can we harness humanity's super intelligence and create tech that benefits the species as a whole rather than destroying the planet to line the pockets of a few?

"Some huge, big, fundamental change has to happen to sustain our long-term development of ideas and, basically, for the sake of human beings," said Kazuhiro Gomi (pictured), president and chief executive officer of NTT Research Inc.

Gomi spoke with Jeff Frick, host of theCUBE, SiliconANGLE Media's livestreaming studio, during Upgrade 2020, the NTT Research Summit. They discussed basic research, NTT's operational goals, and the Upgrade 2020 summit. (* Disclosure below.)

Following in the footsteps of the venerable AT&T's Bell Labs, NTT Research is a subsidiary of Japan's Nippon Telegraph and Telephone Corp. (NTT). But the company's mission extends beyond industry to promoting positive change for humanity through technological innovation.

NTT opened a lab in Silicon Valley in 2019 to facilitate global collaboration within the basic research community. The Upgrade 2020 Global Research Summit is a way for the company to demonstrate what it is doing and invite the world to add to the conversation.

"For us, basic research means that we don't necessarily have a product roadmap or commercialization roadmap. We just want to look at the fundamental core technology of all things," Gomi said.

NTT's research focuses on quantum computing; cryptography and information security; and medical and health informatics. The Summit's agenda reflects these areas, with day one devoted to an overview, followed by three days of deep dives into physics and informatics, cryptography and information security, and medical and health informatics.

"Day one will be a great day to understand more holistically what we are doing," Gomi said. "However, given the type of research topic that we are tackling, we need the deep dive conversations, very specific to each topic by the specialist and the experts in each field."

Day two kicks off with a session titled "Coherent Nonlinear Dynamics and Combinatorial Optimization," given by Stanford professor of applied physics Hideo Mabuchi. Other equally in-depth discussions include research into biological digital twins. "Basically, the computer system can simulate or emulate your own body, not just a generic human body," Gomi explained. "If you get that precise simulation of your body, you can do a lot of things."

The ability to predict future illnesses or physical problems as a body ages is one scenario. Another is testing medicines using a medical doppelgänger to eliminate human risk. The technology is in its infancy, but the potential is definitely ground-breaking.

"It's going to be a pretty long journey," Gomi stated. "We're starting from trying to get the digital twin for the cardiovascular system, so basically to create your own heart."

Collaboration with professors and researchers at prestigious universities is essential for the mission of NTT, and the summit has a roster of high-level academics from MIT, UCLA, Caltech and Stanford, as well as Leicester University in the U.K. and Keio University in Tokyo.

"Listening in to those sessions, you will learn what's going on in the NTT researchers' minds as they tackle each problem. But at the same time, you will get to hear top-level researchers and professors in each field," Gomi said. He offered an open invitation for anyone to join the summit and continue the conversation by contacting him and the other researchers at NTT.

"I believe this is going to be a unique [summit] to understand what it's like in the research fields of quantum computing, encryption, and medical informatics around the world," Gomi concluded.

Watch the complete video interview below, and be sure to check out more of SiliconANGLE's and theCUBE's coverage of Upgrade 2020, the NTT Research Summit. (* Disclosure: TheCUBE is a paid media partner for Upgrade 2020, the NTT Research Summit. Neither NTT Research, the sponsor for theCUBE's event coverage, nor other sponsors have editorial control over content on theCUBE or SiliconANGLE.)

Read the rest here:
Will technology save humanity? Researchers gather at global summit on bleeding-edge tech for good - SiliconANGLE News

Quantum Cryptography Market 2020 Size, Demand, Trends and Growth by Business Opportunities, Latest Innovation, Technology Trends and Forecast 2025 -…

Quantum Cryptography Market analysis is provided for the international markets, including development trends, competitive landscape analysis, geography, end users, applications, market share, COVID-19 analysis, and a 2020-2025 forecast. The predictions in the report have been derived using proven research techniques, methodologies, and assumptions. The report states the market overview and historical data, along with the size, growth, share, demand, and revenue of the global industry.

Further, Quantum Cryptography Market report also covers the development policies and plans, manufacturing processes and cost structures, marketing strategies followed by top Quantum Cryptography players, distributors analysis, Quantum Cryptography marketing channels, potential buyers and Quantum Cryptography development history. This report also states import/export, supply and consumption figures as well as cost, price, revenue and gross margin by regions.

Get sample copy of Quantum Cryptography Market report @ https://www.adroitmarketresearch.com/contacts/request-sample/958

In addition, the report delivers a detailed analysis of the global Quantum Cryptography market for the forecast period. The study offers deep insights into the different market segments based on end use, type, and geography. One of the most crucial features of any report is its geographical segmentation of the market, which covers all the key regions. This section focuses on the developments taking place in each region and how they affect the market. The regional analysis provides thorough knowledge of business opportunities, market status and forecast, revenue potential, regional markets by end user and type, and forecasts for upcoming years.

Top Leading Key Players are:

ID Quantique, MagiQ Technologies, Infineon Technologies, QuintenssenceLabs, Crypta Labs, ISARA, Toshiba, Microsoft, IBM, HP, PQ Solutions, and Qubitekk.

Browse the complete report @ https://www.adroitmarketresearch.com/industry-reports/quantum-cryptography-market

Leading players of the global Quantum Cryptography market are analyzed taking into account their market share, recent developments, new product launches, partnerships, mergers or acquisitions, and markets served. We also provide a comprehensive analysis of their product portfolios to explore the products and applications they concentrate on when operating in the global Quantum Cryptography market. In addition, the report offers two separate market forecasts: one for the production side and another for the consumption side of the global Quantum Cryptography market.

The study analyses numerous factors that are influencing the Quantum Cryptography market from supply and demand side and further evaluates market dynamics that are impelling the market growth over the prediction period. In addition to this, the Quantum Cryptography market report provides inclusive analysis of the SWOT and PEST tools for all the major regions such as North America, Europe, Asia Pacific, and the Middle East and Africa. The report offers regional expansion of the industry with their product analysis, market share, and brand specifications. Furthermore, the Quantum Cryptography market study offers an extensive analysis of the political, economic, and technological factors impelling the growth of the market across these economies.

The research provides answers to the following key questions:

What is the estimated growth rate of the market for the forecast period 2019-2025? What will be the market size during the estimated period?

What are the key driving forces responsible for shaping the fate of the Quantum Cryptography market during the forecast period?

Who are the major market vendors and what are the winning strategies that have helped them occupy a strong foothold in the Quantum Cryptography market?

What are the prominent market trends influencing the development of the Quantum Cryptography market across different regions?

What are the major threats and challenges likely to act as a barrier in the growth of the Quantum Cryptography market?

What are the major opportunities the market leaders can rely on to gain success and profitability?

For Any Query on the Quantum Cryptography Market: https://www.adroitmarketresearch.com/contacts/enquiry-before-buying/958

Contact Us:

Ryan Johnson
Account Manager Global
3131 McKinney Ave Ste 600, Dallas, TX 75204, U.S.A.
Phone No.: USA: +1 972-362-8199 / +91 9665341414

More:
Quantum Cryptography Market 2020 Size, Demand, Trends and Growth by Business Opportunities, Latest Innovation, Technology Trends and Forecast 2025 -...

The Top Internet of Things (IoT) Authentication Methods and Options – Security Boulevard

Gartner recently labeled Internet of Things authentication a high-benefit technology in its 2020 Hype Cycle for IAM Technologies. This blog covers your options for IoT authentication.

IoT authentication is a model for building trust in the identity of IoT machines and devices to protect data and control access when information travels via an unsecured network such as the Internet.

Strong IoT authentication is needed so that connected IoT devices and machines can be trusted to protect against control commands from unauthorized users or devices.

Authentication also helps prevent attackers from claiming to be IoT devices in the hope of accessing data on servers such as recorded conversations, images, and other potentially sensitive information.

There are several methods by which we can achieve strong authentication to secure IoT device communications.

The Internet of Things (IoT) is not just a single technology, but a connected environment of various machines (things) that work together independently without human interaction.

The authorization process is the tool used to validate the identity of each endpoint in the IoT system. The certification process is configured upon enrollment entry and informs the service provider of the method to be used when checking the system's identity during registration.

Machine Identity Management aims to build and manage confidence in the identity of a machine that interacts with other devices, applications, clouds, and gateways.

This may include the authentication and authorization of IoT devices such as:

Each IoT machine needs a unique digital identity when connecting to a gateway or a central server to prevent malicious actors from gaining control of the system. This is accomplished through binding an identity to a cryptographic key, unique per IoT device.

Machine identity management approaches are specifically responsible for discovering the credentials used by machines and the management of their life cycle.

IoT devices are often hacked remotely, involving a hacker trying to enter the device using an internet connection. If an IoT device is only allowed to communicate with an authenticated server, any outside attempts to communicate will be ignored.

According to the 2018 Symantec threat report, the number of IoT attacks increased by 600 percent between 2016 and 2017, from 6,000 to 50,000 attacks.

Therefore, when IoT devices are implemented within corporate networks, security needs to be given much more attention. To address this issue, powerful but efficient cryptography solutions must be used to standardize secure communication between machines.

However, it is a tough decision to choose the right IoT authentication model for the job. Before deciding which architecture model is ultimately the best for IoT authentication, you need to consider several factors, such as energy resources, hardware capacity, financial budgets, security expertise, security requirements, and connectivity.

The X.509 protocol (IETF RFC 5280) provides the most secure digital identity authentication type and is based on the certificate chain of trust model. The use of X.509 certificates as a certification mechanism is an excellent way to scale up production and simplify equipment delivery.

Public key infrastructure (PKI) consists of a tree-like structure of servers and devices that maintain a list of trusted root certificates. Each certificate contains the device's public key and is signed with the CA's private key. A unique thumbprint provides a unique identity that can be validated by running a crypto algorithm, such as RSA.

Digital certificates are typically arranged in a chain of certificates in which each certificate is signed by the private key of another trusted certificate, and the chain must return to a globally trusted root certificate. This arrangement establishes a delegated chain of trust from the trusted root certificate authority (CA) to the final entity leaf certificate installed on the device through each intermediate CA.
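The chain-walk logic described above can be sketched in a few lines. This is a toy illustration only: real X.509 validation uses asymmetric signatures (RSA or ECDSA) and a proper crypto library, and all names and key material below are made up. A SHA-256 digest over the issuer's key stands in for the signature check, purely to show how trust is delegated from leaf certificate back to root.

```python
import hashlib

# Toy illustration of walking a certificate chain back to a trusted root.
# NOTE: real X.509 uses asymmetric signatures; the SHA-256 "signature"
# below is a stand-in used only to show the delegated-trust walk.

def toy_sign(issuer_key: bytes, payload: bytes) -> str:
    # Stand-in for "the issuer signs this certificate's payload".
    return hashlib.sha256(issuer_key + payload).hexdigest()

def verify_chain(chain, trusted_roots):
    """chain is leaf-first; each cert is a dict with subject/issuer/payload/sig."""
    for cert, issuer in zip(chain, chain[1:]):
        if cert["issuer"] != issuer["subject"]:
            return False  # broken issuer/subject linkage
        if cert["sig"] != toy_sign(issuer["key"], cert["payload"]):
            return False  # signature does not verify against the issuer
    # The chain must terminate at a globally trusted root.
    return chain[-1]["subject"] in trusted_roots

root_key, ca_key = b"root-secret", b"intermediate-secret"
root = {"subject": "RootCA", "issuer": "RootCA", "key": root_key,
        "payload": b"root", "sig": toy_sign(root_key, b"root")}
intermediate = {"subject": "IntermediateCA", "issuer": "RootCA", "key": ca_key,
                "payload": b"ica", "sig": toy_sign(root_key, b"ica")}
leaf = {"subject": "device-001", "issuer": "IntermediateCA", "key": b"",
        "payload": b"device-001-pubkey",
        "sig": toy_sign(ca_key, b"device-001-pubkey")}

print(verify_chain([leaf, intermediate, root], {"RootCA"}))  # True
```

Tampering with any certificate in the chain, or anchoring it at an unknown root, makes the walk fail, which is exactly the property the CA hierarchy provides.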

It requires a lot of management control, but there are many vendor options out there.

However, X.509 certificate lifecycle management can be a challenge due to the logistical complexities involved and comes at a price, adding to the overall solution cost. For this reason, many customers rely on external vendors for certificates and lifecycle automation.

The Hardware Security Module, or HSM, is used for secure, hardware-based device secret storage and is the safest form of secret storage. Both the X.509 certificate and the SAS token can be stored in the HSM. HSMs may be used with the two attestation mechanisms supported by the provisioning service.

Alternatively, device secrets may also be stored in software (memory), but this is a less secure form of storage compared to an HSM.

In IoT authentication deployments, it is essential to verify the identity of each device that communicates with the messaging gateway. The usual method is to generate key pairs for devices that are then used to authenticate and encrypt traffic. However, disk-based key pairs are susceptible to tampering.

TPMs come in a number of different forms, including:

While a typical TPM has several cryptographic capabilities, three key features are relevant to IoT authentication:

Device manufacturers cannot always have full confidence in all entities in their supply chain (for example, offshore assembly plants). Still, they cannot give up the economic benefits of using low-cost suppliers and facilities. The TPM can be used at various points along the supply chain to verify that the device has not been incorrectly modified.

The TPM has the capability to store the keys securely in tamper-resistant hardware. The keys are generated within the TPM itself and are therefore protected from being retrieved by external programs. Even without harnessing the capabilities of a trusted hardware root and a secure boot, the TPM is just as valuable as a hardware key store. Private keys are protected by hardware and offer much better protection than a software key.

With a TPM, you can't roll the key without destroying the identity of the chip and giving it a new one. It's like if you had a clone: your clone would have the same physical characteristics as you, but they're a different person in the end. Although the physical chip remains the same, your IoT solution has a new identity.

Some key differences between TPMs and symmetric keys (discussed further below) are as follows:

Symmetric Key Certification is a simple approach to authenticating a device with a Device Provisioning Service instance. This certification method is the "Hello World" experience for developers who are new to provisioning or do not have strict security requirements. Device attestation using a TPM or an X.509 certificate is more secure and should be used for more stringent security requirements.

Symmetric key enrollments also provide a great way for legacy devices with limited security features to boot into the cloud via Azure IoT.

The symmetric key attestation with the Device Provisioning Service is carried out using the same security tokens supported by IoT hubs to identify the devices. These security tokens are SAS (Shared Access Signature) tokens.

SAS tokens have a hashed signature created using a symmetric key. The signature is recreated by the Device Provisioning Service to verify whether the security token presented during certification is authentic.

When the device certifies with an individual enrollment, the device uses the symmetric key defined in the individual enrollment entry to create a hashed signature for the SAS token.
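As a rough sketch of this pattern (not official Azure SDK code; the hub URI, device ID, and key below are made up), a SAS token can be built by HMAC-SHA256-signing the URL-encoded resource URI plus an expiry timestamp with the base64-decoded symmetric key:

```python
import base64
import hashlib
import hmac
import time
import urllib.parse

# Sketch of the commonly documented Azure-IoT-style SAS token pattern:
# HMAC-SHA256 over "<url-encoded resource URI>\n<expiry>" using the
# base64-decoded symmetric key. URI and key here are illustrative only.

def generate_sas_token(resource_uri: str, symmetric_key_b64: str,
                       ttl_seconds: int = 3600) -> str:
    expiry = int(time.time()) + ttl_seconds
    encoded_uri = urllib.parse.quote(resource_uri, safe="")
    to_sign = f"{encoded_uri}\n{expiry}".encode("utf-8")
    key = base64.b64decode(symmetric_key_b64)
    sig = base64.b64encode(
        hmac.new(key, to_sign, hashlib.sha256).digest()).decode()
    return (f"SharedAccessSignature sr={encoded_uri}"
            f"&sig={urllib.parse.quote(sig, safe='')}&se={expiry}")

demo_key = base64.b64encode(b"demo-symmetric-key").decode()
token = generate_sas_token("myhub.azure-devices.net/devices/device-001", demo_key)
print(token)
```

Because the provisioning service holds the same symmetric key, it can recreate the HMAC from the presented `sr` and `se` fields and compare signatures to authenticate the device.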

Shared symmetric keys may be less secure than X.509 or TPM certificates because the same key is shared between the device and the cloud, which means that the key needs to be protected in two places. Designers using symmetric keys sometimes hardcode the clear (unencrypted) keys on the device, leaving the keys vulnerable, which is not a recommended practice.

Proper implementation of IoT authentication has many beneficial effects on IoT security. However, choosing the right method can be challenging, and the wrong choice can increase risks tenfold.

Some risks can be mitigated by securely storing the symmetric key on the device and following best practices around key storage. It is not impossible to do well, but when symmetric keys are used alone, they can be less secure than HSM, TPM, and X.509 implementations.

In the case of HSM, TPM, and X.509 implementations, the main challenge is to prove possession of the key without revealing the key's private portion.

More:
The Top Internet of Things (IoT) Authentication Methods and Options - Security Boulevard

Fintech Harvest to Offer AI and Machine Learning enhanced Financial Wellness Reports – Crowdfund Insider

US-based Fintech firm Harvest is reportedly planning to transform the existing credit scoring systems.

Like many other Fintechs, Harvest will be using artificial intelligence (AI) and machine learning (ML) algorithms to offer clients a dynamic view of their financial profile. This should help people with creating their financial wellness plans. It will also assist them with making informed decisions throughout the lifetime of any loans they decide to take out.

Harvest's PRO Index (PariFi Rating & Opportunity Index) takes into consideration many different factors along with a client's credit score so that individuals and businesses are able to make better assessments and decisions related to managing their finances.

The Harvest index uses advanced AI and machine learning algorithms to create a holistic or thorough financial plan for each client based on their credit score, income level, and spending habits. The AI is able to automatically negotiate or settle banking fees and interest charges. The software can also manage and identify recurring payments.
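Harvest has not published its algorithms, but the recurring-payment detection it describes can be approximated with a simple heuristic: group transactions by merchant and flag merchants whose charges arrive at roughly regular intervals. A minimal sketch (merchant names, data, and thresholds are all illustrative):

```python
from collections import defaultdict
from datetime import date

# Illustrative sketch (not Harvest's actual algorithm): flag a merchant as
# "recurring" when its charges are spaced at roughly equal intervals.

def find_recurring(transactions, min_occurrences=3, tolerance_days=4):
    by_merchant = defaultdict(list)
    for merchant, day in transactions:
        by_merchant[merchant].append(day)
    recurring = []
    for merchant, days in by_merchant.items():
        if len(days) < min_occurrences:
            continue
        days.sort()
        gaps = [(b - a).days for a, b in zip(days, days[1:])]
        avg = sum(gaps) / len(gaps)
        # Regular cadence: every gap close to the average gap.
        if all(abs(g - avg) <= tolerance_days for g in gaps):
            recurring.append(merchant)
    return recurring

txns = [
    ("StreamFlix", date(2020, 7, 1)), ("StreamFlix", date(2020, 8, 1)),
    ("StreamFlix", date(2020, 9, 1)),
    ("Corner Cafe", date(2020, 7, 3)), ("Corner Cafe", date(2020, 7, 8)),
    ("Corner Cafe", date(2020, 8, 2)),
]
print(find_recurring(txns))  # ['StreamFlix']
```

A production system would also weigh transaction amounts and fuzzy-match merchant name variants, but the interval heuristic captures the core idea.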

The platform has been designed to provide a central picture of a client's entire financial profile. It provides an easy-to-read summary of their different assets, debts, and expenses in one convenient place, and includes action items so that they can take the necessary steps to improve their financial wellness plan. Customers who begin using the index can access special tools that help automatically obtain refunds of bank fees and various interest charges. Users are also able to better manage their debt and analyze their day-to-day or monthly expenses.

Harvest also offers Ability-to-Pay as a Service (A2PaaS). This option allows users to gain access to real-time transactional data and offers an overview of a client's financial status. The feature lets lenders identify pre-arrears clients much faster than traditional methods and also improves recovery.

Nami Baral, CEO of Harvest, stated:

"Consumers are facing an unparalleled time of insecurity around their finances and the future. We're launching the PRO Index now to help them truly understand their baseline. Our hope is that this index and our platform will empower the majority to take control of and secure their hard-earned money."

View original post here:
Fintech Harvest to Offer AI and Machine Learning enhanced Financial Wellness Reports - Crowdfund Insider

GreenFlux, Eneco eMobility and Royal HaskoningDHV implement smart charging based on machine learning – Green Car Congress

Royal HaskoningDHV's office in the city of Amersfoort, the Netherlands, is the first location in the world where electric vehicles are smart charged using machine learning. The charging stations are managed by the charging point operator Eneco eMobility, with smart charging technology provided by the GreenFlux platform.

With the number of electric vehicles ever increasing, so is the pressure to increase the number of charging stations on office premises. This comes at a cost; electric vehicles require a significant amount of power, which can lead to high investments in the electrical installation. With smart charging these costs can be significantly reduced, by ensuring that not all vehicles charge at the same time.

With the innovation, developed by GreenFlux, deployed by Eneco eMobility, and applied at Royal HaskoningDHV's Living Lab Charging Plaza in Amersfoort, the Netherlands, smart charging is now taken to the next level, allowing up to three times more charging stations on a site than with regular smart charging.

The novelty in this solution is that machine learning is used to determine or estimate how charging sites are physically wired, data that is commonly incomplete and unreliable. At Royal HaskoningDHV, the algorithm determines over time the topology of how the three-phase electricity cables are connected to each individual charge station.

Using this topology, the algorithm can optimize between single-phase and three-phase charging electric vehicles. Though this may seem like a technicality, it allows up to three times as many charging stations to be installed on the same electrical infrastructure.
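GreenFlux has not disclosed its model, but the core idea of inferring phase topology from measurements can be illustrated with a simple correlation heuristic: assign each charge station to the grid phase whose aggregate current its own readings track most closely. A toy sketch with made-up readings:

```python
# Toy sketch (not GreenFlux's algorithm): infer which grid phase each
# charging station is wired to by correlating each station's measured
# current with each phase's aggregate current over time.

def correlation(xs, ys):
    # Pearson correlation coefficient for two equal-length series.
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

def infer_topology(phase_currents, station_currents):
    """Both args map a name to a list of current readings over time."""
    return {
        station: max(phase_currents,
                     key=lambda p: correlation(phase_currents[p], readings))
        for station, readings in station_currents.items()
    }

phases = {"L1": [10, 22, 11, 25, 12],
          "L2": [5, 5, 18, 6, 19],
          "L3": [8, 8, 8, 9, 8]}
stations = {"CP-01": [9, 21, 10, 24, 11],   # rises and falls with L1
            "CP-02": [0, 0, 13, 1, 14]}     # rises and falls with L2
print(infer_topology(phases, stations))  # {'CP-01': 'L1', 'CP-02': 'L2'}
```

With the phase assignment known, the smart charging controller can balance single-phase vehicles across phases instead of reserving capacity on all three, which is where the headroom for extra stations comes from.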

"Now that this part has been tested and proven, there is so much more we can add. We can use the same technology to, for instance, predict a driver's departure time or how much energy they will need. With these kinds of inputs, we can optimize the charging experience even further."

Lennart Verheijen, head of innovation at GreenFlux

See the original post:
GreenFlux, Eneco eMobility and Royal HaskoningDHV implement smart charging based on machine learning - Green Car Congress

Collaboration Will Offer Data to Train Machine Learning Tools – HealthITAnalytics.com

September 28, 2020 - Researchers at the University of Iowa (UI) have received a $1 million grant from the National Science Foundation (NSF) to develop a machine learning platform to train algorithms with data from around the world.

The phase one grant will enable the UI team to lead a multi-university and industry collaboration and address concerns around patient privacy and data security in clinical AI development.

The researchers noted that although the use of AI is widespread in healthcare, training effective machine learning algorithms requires thousands of samples annotated by doctors. This can lead to privacy and security issues, the team stated.

"Traditional methods of machine learning require a centralized database where patient data can be directly accessed for training a machine learning model," said Stephen Baek, assistant professor of industrial and systems engineering at UI.

"Such methods are impacted by practical issues such as patient privacy, information security, data ownership, and the burden on hospitals which must create and maintain these centralized databases."

The team will develop a decentralized, asynchronous solution called ImagiQ, which relies on an ecosystem of machine learning models so that institutions can select models that work best for their populations. Organizations will be able to upload and share the models, not patient data, with each other.

As each institution improves the model using their local patient data sets, models will be uploaded back to a centralized server. This ensemble learning approach will allow the most reliable and efficient models to come to the forefront, resulting in a better AI system for analyzing images like lung x-rays or CT scans that detect tumors.
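The ensemble idea described above, share models rather than patient data, score them locally, and let reliable models rise to the top, can be sketched in a few lines. This is not ImagiQ's actual implementation; it is an illustrative example where three "institutions" each train on their own split of a synthetic dataset, and a threshold (assumed here) filters the shared models before a majority vote.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression

# Hypothetical sketch of the share-models-not-data ensemble approach.
X, y = make_classification(n_samples=1200, n_features=20, random_state=0)
# Split across three "institutions"; no site ever sees another site's data.
sites = np.array_split(np.arange(len(y)), 3)

# Each institution trains locally and uploads only the fitted model.
models = [LogisticRegression(max_iter=1000).fit(X[idx], y[idx])
          for idx in sites]

# Every site evaluates every shared model on its own local data and
# reports a score; models below an (assumed) threshold are phased out.
scores = np.array([[m.score(X[idx], y[idx]) for idx in sites]
                   for m in models])
keep = scores.mean(axis=1) > 0.7

def ensemble_predict(X_new):
    """Majority vote of the retained models."""
    votes = np.stack([m.predict(X_new)
                      for m, kept in zip(models, keep) if kept])
    return (votes.mean(axis=0) > 0.5).astype(int)

accuracy = (ensemble_predict(X) == y).mean()
print(f"ensemble accuracy: {accuracy:.2f}")
```

The key property this preserves is the one Baek emphasizes: raw patient records never leave an institution; only model parameters and aggregate scores circulate.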

The UI-led team includes researchers from Stanford University, the University of Chicago, Harvard University, Yale University, and Seoul National University.

Over the next nine months, the group will aim to develop a prototype of the system and participate in the Accelerator's innovation curriculum to ensure the solution has societal impact. By the end of phase one, the team will take part in a pitch competition and proposal evaluation and, if selected, will proceed to phase two, with potential funding of up to $5 million over 24 months.

"ImagiQ will further federated learning by decentralizing the model updates and eliminating the synchronous update cycle," said Baek. "We are going to create a whole ecosystem of machine learning models that will evolve and improve over time. High-performing models will be selected by many institutions, while others are phased out, producing more reliable and trustworthy outputs."

The research team is part of the AI-driven data and model sharing track topic under the 2020 cohort NSF Convergence Accelerator program, designed to leverage a convergence approach to transition basic research and discovery into practice. NSF is investing more than $27 million to support the teams in phase one to develop the solution groundwork for AI-Driven Data and Model Sharing.

The Convergence Accelerator's AI-Driven Innovation via Data and Model Sharing topic involves 18 teams concentrating on solution development. These teams will also address a variety of data- and model-related challenges and data types, including platform development to enable easy and efficient data matching and sharing.

"The quantum technology and AI-driven data and model sharing topics were chosen based on community input and identified federal research and development priorities," said Douglas Maughan, head of the NSF Convergence Accelerator program. "This is the program's second cohort, and we are excited for these teams to use convergence research and innovation-centric fundamentals to accelerate solutions that have a positive societal impact."

See the rest here:
Collaboration Will Offer Data to Train Machine Learning Tools - HealthITAnalytics.com

Odias in Machine Learning global virtual conference to be held today – Times of India

BHUBANESWAR: A group of Odias working in, or interested in, the fields of artificial intelligence (AI) and machine learning (ML) will organise a global virtual conference on Sunday to promote the use of AI and ML for the development of Odisha and the advancement of the Odia language in the digital era. The conference, called the Odias in ML Conference, also aims to showcase career and entrepreneurship opportunities in AI and ML for Odias across the world, said Anjan Kumar Panda, convenor of the conference. It will be attended by technologists, researchers, academicians, business executives, entrepreneurs, policymakers, linguists, language activists, media persons and community leaders, all with a commitment to AI and machine learning. The conference will have four sessions, themed "AI for Odisha", "AI for Odia language", "research and career opportunities in AI", and "entrepreneurship and business opportunities in AI". Odisha government's IT secretary Manoj Mishra, president of the Odisha Society of America Kuku Das, Mo School chairperson Susmita Bagchi and other noted persons will attend the conference. The virtual conference, which will commence at 5pm on October 4, will be available across social media platforms like YouTube, Facebook and Twitter.

Visit link:
Odias in Machine Learning global virtual conference to be held today - Times of India

Machine Learning as a Service (MLaaS) Market Challenges Report 2020: Comprehensive Insights, Future Forecasts To 2028 | Covid 19 Implications And…

The Machine Learning as a Service (MLaaS) Market study offers a thorough assessment of the market, highlighting facts on various features including restraints, drivers, threats, and opportunities. The global market report includes a competitive landscape analysis, market trends, and strategic regional growth status.

Click here to get a sample of the premium report: https://www.quincemarketinsights.com/request-sample-50032?utm_source=dc/hp

This study offers an inclusive numerical analysis of the Machine Learning as a Service (MLaaS) industry, along with statistics for planning and making strategies to augment market growth. The study also evaluates gross margin, market size, revenue, price, market share, growth rate, and cost structure in the market for efficient decision making.

COVID-19 Impact Analysis

The report offers a study on the recession, the latest market scenario, and the effect of the COVID-19 on the entire industry. It offers qualitative data on the industry to market players who are reconsidering their goals by assessing the situation and considering possible actions to cope with this crisis.

Companies are facing growing business concerns associated with the coronavirus outbreak, including a rising risk of recession, supply chain disruptions, and a possible drop in consumer spending. However, these circumstances will play out differently across states and industries. The report will help companies make accurate and timely decisions in these hard times.

Market Study Outline:

This research study offers breakthrough inputs and insights on market-related factors like competition, size, trends, forecasts, analysis, etc. The study encompasses secondary and primary data sources along with qualitative and quantitative practices, thus promising data accuracy. The study offers capacity, product specifications, company profiles, 2016-2028 market shares for key vendors, and production value.

Machine Learning as a Service (MLaaS) Market

Insights on the Report:

The Machine Learning as a Service (MLaaS) Market research study offers a complete assessment of the market and contains projections with a suitable set of assumptions, thoughtful insights, historical data, facts, statistically-supported information, industry-validated market data, and methodology. It offers an analysis and data by categories such as regions, market segments, distribution channels, and product type.

Market Breakdown

This research study classifies the global market by brands, players, types, regions, and applications. The market study offers insights on company variables, complex conditions prevailing in the industry including situational factors, and industry features. The market is segmented as By Type (Special Services and Management Services), By Organization Size (SMEs and Large Enterprises), By Application (Marketing & Advertising, Fraud Detection & Risk Analytics, Predictive Maintenance, Augmented Reality, Network Analytics, and Automated Traffic Management), and By End User (BFSI, IT & Telecom, Automobile, Healthcare, Defense, Retail, Media & Entertainment, and Communication). The market study provides a detailed analysis by studying individual conditions and circumstances that are facilitating the market growth.

Get the ToC for an overview of the premium report: https://www.quincemarketinsights.com/request-toc-50032?utm_source=dc/hp

Regional Market:

The Machine Learning as a Service (MLaaS) Market is analyzed, and market scope and information are provided, by region (country). The key regions covered in the study are North America, Europe, Asia Pacific, the Middle East and Africa, and South America. It also covers key countries, viz., Canada, the U.S., Germany, the U.K., France, Italy, China, Russia, Japan, India, South Korea, Australia, Indonesia, Taiwan, Thailand, the Philippines, Malaysia, Vietnam, Brazil, Mexico, Turkey, the U.A.E., Saudi Arabia, etc.

Competitive Conclusion:

The study offers a complete analysis and precise statistics and revenue figures, by player, for the period 2016-2028. It also offers a thorough analysis sustained by reliable statistics on revenue (at the global and regional levels) and by participant for the 2016-2028 timeframe. Details included in the report are major business lines, company descriptions, total revenue and sales, recent developments, and revenue generated in the Machine Learning as a Service (MLaaS) business. The leading companies covered in this report are Microsoft, IBM Corporation, International Business Machine, Amazon Web Services, Google, BigML, FICO, Hewlett-Packard Enterprise Development, AT&T, Fuzzy.ai, Yottamine Analytics, Ersatz Labs, Inc., and Sift Science Inc.

Speak to an analyst before buying this report: https://www.quincemarketinsights.com/enquiry-before-buying-50032?utm_source=dc/hp

Market Highlights

The study is an all-inclusive research study of the global Machine Learning as a Service (MLaaS) Market, taking into account recent trends, growth factors, developments, the competitive landscape, and opportunities. The market researchers and analysts have done a broad analysis of the global Machine Learning as a Service (MLaaS) Market with the help of research methodologies such as Porter's Five Forces analysis and PESTLE.

The study will help the market leaders as well as the new entrants in this market with information on the closest approximations of the revenue numbers for the overall market and the sub-segments. This study will help stakeholders understand the competitive landscape and gain more insights to better position their businesses and plan suitable go-to-market strategies.

About Us

We take pride in serving our present and new customers with information and analysis that match and suit their objectives. Our research study can be customized to include clinical trial results data, price trend analysis of target brands, understanding the market for additional countries (ask for the list of countries), literature review, product base analysis, and refurbished market analysis.

Contact:

Quince Market Insights

Office No- A109

Pune, Maharashtra 411028

Phone: APAC +91 706 672 4848 / US +1 208 405 2835 / UK +44 1444 39 0986

Email: [emailprotected]

Web: https://www.quincemarketinsights.com

Read more:
Machine Learning as a Service (MLaaS) Market Challenges Report 2020: Comprehensive Insights, Future Forecasts To 2028 | Covid 19 Implications And...

A Machine Learning Tool Supports the Search for New Craters on Mars – Science Times

Planetary scientists and artificial intelligence (AI) researchers have collaborated on a machine learning tool that helped discover new craters on Mars - including small impacts left by a meteor about eight years ago.

Sometime between March 2010 and May 2012, a meteor flew over Mars, burned up, and disintegrated into smaller pieces that crashed into the planet's surface. It left unusually small craters, only 4 meters (13 feet) wide and relatively easy to miss. With the help of the AI-driven tool, NASA scientists at the Jet Propulsion Laboratory (JPL) in Pasadena, California expect shorter analysis times and more discoveries on the Red Planet's surface.


Usually, NASA scientists manually analyze the images taken by the Mars Reconnaissance Orbiter (MRO) in search of uncommon phenomena on the Red Planet's surface: avalanches, shifting sand dunes, dust devils, and more. Over the MRO's 14 years of service, its data has allowed the space agency to find more than 1,000 craters. Most of these discoveries begin with the Context Camera aboard the orbiter, which takes extremely large, low-resolution images of the planet's surface covering hundreds of miles per shot.


Craters are detected through their blast marks, which are visible in the low-res images. The craters themselves, however, remain virtually invisible, which leads to the next step: using the High Resolution Imaging Science Experiment (HiRISE) to get clearer, more detailed pictures of the target. Its vision system can detect even the tracks left behind by the Curiosity rover. Additionally, the research team allows the public to submit specific imaging requests through the HiRISE HiWish website.

Reviewing a single Context Camera image in this next step takes a researcher around 40 minutes, according to a NASA press release. To cut the time required, the JPL team created a machine learning tool called the Automated Fresh Impact Crater Classifier. The tool is part of a wider effort among Jet Propulsion Laboratory scientists called COSMIC (Capturing Onboard Summarization to Monitor Image Change) that aims to continuously improve Mars orbiters.

JPL researchers trained the crater classifier on a total of 6,830 Context Camera images, including locations with impacts already identified and confirmed by HiRISE. The training set also included images with no impacts, to teach the tool what not to look for.

After the training phase, the crater classifier was deployed on the Context Camera's repository of more than 100,000 pictures. A process that used to take 40 minutes is now accomplished in an average of 5 seconds, thanks to a set of high-performance computers operating in parallel within JPL's supercomputer cluster.

"It wouldn't be possible to process over 112,000 images in a reasonable amount of time without distributing the work across many computers," explained Gary Doran, a computer scientist at JPL. The team's initial challenge was running 750 copies of the classifier across the entire cluster.
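The speedup Doran describes comes from a classic embarrassingly-parallel pattern: each image can be classified independently, so the backlog can be fanned out across many workers. The sketch below is purely illustrative, not JPL's pipeline; `classify_image` is a hypothetical stand-in for one model-inference call, and a thread pool stands in for the 750 classifier copies spread across a cluster.

```python
from concurrent.futures import ThreadPoolExecutor

def classify_image(image_id: int) -> tuple[int, bool]:
    # Placeholder "inference": pretend roughly 1 in 97 images
    # shows a fresh impact. A real worker would load the image
    # and run the trained classifier on it.
    return image_id, (image_id % 97 == 0)

image_ids = range(10_000)  # the real survey scanned 112,000+ images
with ThreadPoolExecutor(max_workers=8) as pool:
    results = list(pool.map(classify_image, image_ids))

# Flagged candidates still go to a human for confirmation with HiRISE.
candidates = [i for i, is_hit in results if is_hit]
print(f"{len(candidates)} candidate sites flagged for review")
```

Because each call is independent, the same structure scales from a thread pool on one machine to hundreds of processes on a cluster; only the executor changes, not the classification logic.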


However, a human operator still checks the data returned by the AI tool. Kiri Wagstaff, also a JPL computer scientist, explained that AI tools still can't do the "skilled analysis" that a scientist can do.

Check out more news and information on Mars in Science Times.

Originally posted here:
A Machine Learning Tool Supports the Search for New Craters on Mars - Science Times