Using Machine Learning To Predict Disease In Cattle Might Help Solve A Billion-Dollar Problem – Forbes

One of the challenges in scaling up meat production is animal disease. Take bovine respiratory disease (BRD), for example. This contagious infection is responsible for nearly half of all feedlot deaths for cattle every year in North America. The industry's costs for managing the disease come close to $1 billion annually.

Preventative measures could significantly decrease these costs, and a small team comprising a data scientist, a college student and two entrepreneurs spent the past weekend at the Forbes Under 30 Agtech+ Hackathon figuring out a concept for better managing the disease.

Their solution? Tag-Ag, a conceptual set of predictive models that could take data already routinely gathered by cattle ranchers and tracked using ear tags, both to identify cows at risk for BRD, focusing prevention efforts, and to trace outbreaks of BRD, enabling more focused treatment and management decisions.

"By providing these insights, we can instill confidence in both big consumers such as McDonald's or Wal-Mart, and small consumers like you and me, that their meat is sourced from a healthy and sustainable operation," said team member Natalie McCaffrey, an 18-year-old undergraduate at Washington & Lee University, at the Hackathon's final presentations on Sunday evening.

McCaffrey was joined by Jacob Shields, 30, a senior research scientist at Elanco Animal Health; Marya Dzmiturk, 28, cofounder of TK startup Avanii and an alumnus of the 2020 Forbes Under 30 list in Manufacturing & Industry; and Shaina Steward, 29, founder of The Model Knowledge Group & Ekal Living.

They joined a larger group of hackathoners who brainstormed a variety of concepts related to animal health on Friday night before settling on three different ideas, at which point the group split into smaller teams. The Tag-Ag team's initial pitch was the use of AI and big data to help producers keep animals healthy.

As the Tag-Ag team began its research and development process on Saturday, one clear challenge was the scope of potential animal health issues, as well as the potentially labor-intensive process of collecting useful information. They settled on cattle because, McCaffrey says, big ranchers are already collecting data on cattle electronically, and because BRD by itself makes a huge impact on the industry.

Another advantage of using data already being collected, adds Shields, is that tools exist to build a model for the concept's predictive analytics based on what's out there. For supervised machine learning algorithms, the more inputs the better, he says. "I don't believe we'll need additional studies to support this case, unless we knew of a handful of data points that weren't being collected that really would help with the predictability."
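As an illustration of the kind of supervised model the team describes, here is a minimal sketch that trains a classifier on hypothetical ear-tag features; the feature names, synthetic data, and model choice are invented for this example, not Tag-Ag's actual design.

```python
# Hypothetical sketch: a supervised BRD-risk classifier trained on
# features a smart ear tag might record. All features and labels here
# are synthetic; this is not Tag-Ag's actual model.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 5_000
X = np.column_stack([
    rng.normal(38.6, 0.5, n),  # body temperature (deg C)
    rng.normal(120, 30, n),    # daily activity (arbitrary units)
    rng.normal(9.0, 2.0, n),   # feed intake (kg/day)
])
# Toy rule: fever plus reduced feed intake raises BRD risk.
risk = 0.8 * (X[:, 0] - 38.6) - 0.3 * (X[:, 2] - 9.0)
y = (risk + rng.normal(0, 0.5, n) > 0.5).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
model = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
print("AUC:", roc_auc_score(y_te, model.predict_proba(X_te)[:, 1]))
```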

For a business model, the Tag-Ag team suggests a subscription, with a one-time implementation fee for any hardware needs. They believe that there's definitely room to raise capital, pointing to the size of the market loss they're addressing plus the $500 million in venture capital invested in AgTech companies in 2019 alone.

"Investors and institutions are recognizing opportunities in the AgTech space," McCaffrey says, and beyond that, she adds, "our space of AI and data has space for additional players."

Team members: Natalie McCaffrey, undergraduate, Washington & Lee University; Jacob Shields, senior research scientist, Elanco Animal Health; Marya Dzmiturk, cofounder, Avanii; Shaina Steward, founder, The Model Knowledge Group and Ekal Living.

The rest is here:
Using Machine Learning To Predict Disease In Cattle Might Help Solve A Billion-Dollar Problem - Forbes

Is Quantum Machine Learning the next thing? | by Alessandro Crimi | ILLUMINATION-Curated | Oct, 2020 – Medium

In classical computers, bits are stored as either a 0 or a 1 in binary notation. Quantum computers use quantum bits, or qubits, which can be both 0 and 1 at once; this is called superposition. Last year Google and NASA claimed to have achieved quantum supremacy, though the claim raised some controversy. Quantum supremacy means that a quantum computer can perform a single calculation that no conventional computer, not even the biggest supercomputer, can perform in a reasonable amount of time. Indeed, according to Google, Sycamore is a computer with a 54-qubit processor that can perform such fast computations.
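In standard textbook notation (nothing specific to Sycamore), a qubit's state is a superposition of the two basis states, with amplitudes that set the measurement probabilities:

```latex
\[
\lvert \psi \rangle = \alpha \lvert 0 \rangle + \beta \lvert 1 \rangle,
\qquad \lvert \alpha \rvert^{2} + \lvert \beta \rvert^{2} = 1
\]
```

Measuring the qubit yields 0 with probability |α|² and 1 with probability |β|²; the qubit holds both possibilities until measured.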

Machines like Sycamore can speed up simulation of quantum mechanical systems, drug design, the creation of new materials through molecular and atomic maps, the Deutsch Oracle problem and machine learning.

When data points are projected in high dimensions during machine learning tasks, it is hard for classical computers to deal with such large computations (no matter the TensorFlow optimizations and so on). Even if the classical computer can handle it, an extensive amount of computational time is necessary.

In other words, the computers we use today can be slow at certain machine learning applications compared to quantum systems.

Indeed, superposition and entanglement can come in handy for training support vector machines or neural networks to behave similarly to a quantum system.

How we do this in practice can be summarized as follows.

In practice, quantum computers can be used and trained like neural networks; or better, neural networks can incorporate some aspects of quantum physics. More specifically, in photonic hardware, a trained quantum-computer circuit can be used to classify the content of images by encoding the image into the physical state of the device and taking measurements. If this sounds weird, it is because the topic is weird and difficult to digest. Moreover, the story is bigger than just using quantum computers to solve machine learning problems. Quantum circuits are differentiable, so a quantum computer itself can compute the change in control parameters needed to become better at a given task, pushing the concept of learning further.
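As a concrete, hedged illustration of a differentiable quantum circuit, here is a minimal sketch using the open-source PennyLane library on a simulated device; the circuit layout and parameter values are arbitrary choices for this example, not the photonic setup described above.

```python
# Minimal sketch of a differentiable quantum circuit ("variational
# classifier") using PennyLane. Illustrative only.
import pennylane as qml
from pennylane import numpy as np

dev = qml.device("default.qubit", wires=2)  # classical simulator

@qml.qnode(dev)
def circuit(weights, x):
    # Encode a 2-feature input into qubit rotations.
    qml.RX(x[0], wires=0)
    qml.RX(x[1], wires=1)
    # Trainable layer: rotations plus entanglement.
    qml.RY(weights[0], wires=0)
    qml.RY(weights[1], wires=1)
    qml.CNOT(wires=[0, 1])
    return qml.expval(qml.PauliZ(0))  # readout in [-1, 1]

weights = np.array([0.1, 0.2], requires_grad=True)
x = np.array([0.5, -0.3])
# Because the circuit is differentiable, we can compute the gradient of
# the output with respect to the control parameters, as in training.
grad_fn = qml.grad(circuit, argnum=0)
print(circuit(weights, x), grad_fn(weights, x))
```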

Read the original:
Is Quantum Machine Learning the next thing? | by Alessandro Crimi | ILLUMINATION-Curated | Oct, 2020 - Medium

Machine Learning Is Cheaper But Worse Than Humans at Fund Analysis – Institutional Investor

Morningstar had a problem.

Or rather, its millions of users did: The star-rating system, which drives huge volumes of assets, is inherently backward-looking. These make-or-break badges label how well (or poorly) a fund has performed, not how it will perform.

Morningstar's solution was analysts: humans who dig deep into the big and popular fund products, then assign them forward-looking ratings. For analyzing the lesser or niche products, Morningstar unleashed the algorithms.

But the humans still have an edge, academic researchers found, except in productivity.

"We find that the analyst report, which is usually 4 or 5 pages, provides very detailed information, and is better than a star rating, as it claims to be," said Si Cheng, an assistant finance professor at the Chinese University of Hong Kong, in an interview.

[II Deep Dive: AQR's Problem With Machine Learning: Cats Morph Into Dogs]

The most potent value in all of these Morningstar modes came from the tone of the human-generated reports, assessed using machine-driven textual analysis, Cheng and her co-authors found in a just-published working paper.

Tone is likely to come from soft information, such as what the analyst picks up from speaking to fund management and investors. That deeply human sense of enthusiasm or pessimism matters when it comes through in conflict with the actual rating, which the analysts and algos base on quantitative factors.
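To make the idea of machine-driven tone analysis concrete, here is a toy word-list scorer; the word lists and the scoring rule are invented for illustration and are not the paper's actual method.

```python
# Toy word-list tone scorer, in the spirit of machine-driven textual
# analysis of analyst reports. Word lists are invented for illustration.
POSITIVE = {"strong", "disciplined", "consistent", "impressive", "confident"}
NEGATIVE = {"concern", "underperformed", "departure", "risk", "uncertain"}

def tone(report: str) -> float:
    """Return tone in [-1, 1]: (pos - neg) / total sentiment words."""
    words = report.lower().split()
    pos = sum(w.strip(".,") in POSITIVE for w in words)
    neg = sum(w.strip(".,") in NEGATIVE for w in words)
    return 0.0 if pos + neg == 0 else (pos - neg) / (pos + neg)

print(tone("The manager is disciplined, but the recent departure is a concern."))
# A gold-rated fund whose report scores negative here is exactly the
# rating/tone conflict the study highlights.
```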

Most of Morningstars users are retail investors, but only professionals are tapping into this human-quant arbitrage, discovered Cheng and her Peking University co-authors Ruichang Lu and Xiajun Zhang.

"We do find that only institutional investors are taking advantage of analysts' reports," she told Institutional Investor Tuesday. "They do withdraw from a fund if the fund gets a gold rating but a pessimistic tone."

Cheng, her coauthors, and other academic researchers working in the same vein highlight cost as one major advantage of algorithmic analysis over the old-fashioned kind. "After initial setup, they automatically generate all of the analysis at a frequency that a human cannot replicate," Cheng said.

As Anne Tucker, director of the legal analytics and innovation initiative at Georgia State University, cogently put it, machine learning is "leveraging components of human judgment at scale. It's not a replacement; it's a tool for increasing the scale and the speed. On the legal side, almost all of our data is locked in text: memos, regulatory filings, orders, court decisions, and the like."

Tucker has teamed up with GSU analytics professor Yusen Xia and associate law professor Susan Navarro Smelcer to gather the text of fund filings and turn machine-learning programs loose on them, searching for patterns and indicators of future risk and performance. The project is underway and detailed in a recent working paper.

"We have compiled all of the investment strategy and risk sections from 2010 onwards, and are using text mining, machine learning, and a suite of other computational tools to understand the content, study compliance, and then to aggregate texts in order to model emerging risks," Tucker told II. "If we listen to the most sophisticated investors collectively, what can we learn? If we would have had these tools before 2008, would we have been able to pick up tremors?"

Maybe, but they wouldn't have picked up the Covid-19 crisis, early findings suggest.

"There were essentially no pandemic-related risk disclosures before this happened," Tucker said.

Original post:
Machine Learning Is Cheaper But Worse Than Humans at Fund Analysis - Institutional Investor

Bespoken Spirits raises $2.6M in seed funding to combine machine learning and accelerated whiskey aging – TechCrunch

Bespoken Spirits, a Silicon Valley spirits company that has developed a new data-driven process to accelerate the aging of whiskey and create specific flavors, today announced that it has raised a $2.6 million seed funding round. Investors include Clos de la Tech owner T.J. Rodgers and baseball's Derek Jeter.

The company was co-founded by former Bloom Energy, BlueJeans and Mixpanel exec Stu Aaron and another Bloom Energy alum, Martin Janousek, whose name can be found on a fair number of Bloom Energy patents.

Bespoken isn't the first startup to venture into accelerated aging, a process that tries to minimize the time it takes to age these spirits, which is typically done in wooden barrels. The company argues that it's the first to combine that with a machine learning-based approach, though, through what it calls its ACTivation technology.

"Rather than putting the spirit in a barrel and passively waiting for nature to take its course, and just rolling the dice and seeing what happens, we instead use our proprietary ACTivation technology (with the A, C and T standing for aroma, color and taste) to instill the barrel into the spirit, and actively control the process and the chemical reactions in order to deliver premium quality tailored spirits, and to be able to do that in just days rather than decades," explained Aaron.

Image Credits: Bespoken Spirits

And while there is surely a lot of skepticism around this technology, especially in a business that typically prides itself on its artisanal approach, the company has won prizes at a number of competitions. The team argues that traditional barrel aging is a wasteful process, in which you lose 20% of the product to evaporation, and one that is hard to replicate. And because of how long it takes, it also creates financial challenges for upstarts in this business and makes it hard to innovate.

As the co-founders told me, there are three pillars to the business: selling its own brand of spirits; maturation-as-a-service for rectifiers and distillers; and producing custom private-label spirits for retailers, bars and restaurants. At first, the team mostly focused on the latter two, especially its maturation-as-a-service business. Right now, Aaron noted, a lot of craft distilleries are facing financial strains and need to unlock their inventory and get their product to market sooner, and maybe at a better quality, and hence a higher price point, than they previously could.

There's also the existing market of rectifiers, who, at least in the U.S., take existing products and blend them. These, too, are looking for ways to improve their processes and make them more replicable.

Interestingly, a lot of breweries, too, are now sitting on excess or expired beer because of the pandemic. "They're realizing that rather than paying somebody to dispose of that beer and taking it back, they can actually recycle, or upcycle maybe is a better word, the beer, by distilling it into whiskey," Aaron said. "But unfortunately, when a brewery distills beer into whiskey, it's typically not very good whiskey. And that's where we come in. We can take that beer bin, as a lot of people call initial distillation, and we can convert it into a premium-quality whiskey."

Image Credits: Bespoken Spirits

Bespoken is also working with a few grocery chains, for example, to create bespoke whiskeys for their house brands that match the look and flavor of existing brands or that offer completely new experiences.

The way the team does this is by collecting a lot of data throughout its process and then having a tasting panel describe the product for them. Using that data and feeding it into its systems, the company can then replicate the results or tweak them as necessary without having to wait for years for a barrel to mature.
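A rough sketch of that loop, under the assumption that it amounts to regressing tasting-panel scores on process parameters and then searching for parameters that hit a target flavor profile; every name and value below is invented for illustration, not Bespoken's actual system.

```python
# Hypothetical sketch of the data loop described above: learn a mapping
# from process parameters to panel scores, then search for parameters
# that reproduce a target flavor profile. Synthetic data throughout.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

rng = np.random.default_rng(1)
n = 400
# Process parameters: wood dose (g/L), temperature (C), days
X = np.column_stack([rng.uniform(1, 10, n),
                     rng.uniform(20, 60, n),
                     rng.uniform(1, 10, n)])
# Toy panel score for "oakiness", driven mostly by wood dose and time.
y = 0.6 * X[:, 0] + 0.2 * X[:, 2] + rng.normal(0, 0.3, n)

model = GradientBoostingRegressor().fit(X, y)

# Pick the candidate recipe whose predicted score is closest to target.
target = 5.0
candidates = np.column_stack([rng.uniform(1, 10, 1000),
                              rng.uniform(20, 60, 1000),
                              rng.uniform(1, 10, 1000)])
best = candidates[np.argmin(np.abs(model.predict(candidates) - target))]
print("candidate recipe:", best)
```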

"We're collecting all this data, and some of the data that we're collecting today, we don't even know yet what we're going to use it for," Janousek said. Using its proprietary techniques, Bespoken will often create dozens of samples for a new customer and then help them whittle those down.

"I often like to describe our company as a cross between 23andMe, Nespresso and Impossible Foods," Aaron said. "We're like 23andMe because, again, we're trying to map the customer preference to the recipe to results. There is this big data, genome mapping kind of a thing. And we're like Nespresso because our machine takes spirit and supply pods and produces results, although obviously we're industrial scale and they're not. And it's like Impossible Foods, because it's totally redefining an age-old, antiquated model to be completely different."

The company plans to use the new funding to accelerate its market momentum and build out its technology. Its house brand is currently available for sale in California, Wisconsin and New York.

"The company's ability to deliver both quality and variety is what really caught my attention and made me want to invest," said T.J. Rodgers. "In a short period of time, they've already produced an incredible range of top-notch spirits, from whiskeys to rum, brandy and tequila, all independently validated time and again in blind tastings and prestigious competitions."

Full disclosure: The company sent me a few samples. I'm not enough of a whiskey aficionado to review those, but I did enjoy them (responsibly).

See more here:
Bespoken Spirits raises $2.6M in seed funding to combine machine learning and accelerated whiskey aging - TechCrunch

Q&A: SnapLogic CTO makes the case for investment in machine learning – IT Brief Australia

SnapLogic field chief technology officer Brad Drysdale discusses the roadblocks to a successful machine learning implementation and the ways they can overcome them.

The ML process can seem daunting for many organisations due to its unpredictable and experimental nature.

When IT teams initially start going through the data to deploy ML algorithms, most probably won't know the type of data that is required. A lot of exploration must be done before IT decision-makers have an idea of what data will be useful, and which ML algorithms will work best to solve a particular problem.

Other technical challenges include being able to automate data access. When organisations have formulated clear policies that allow easy access to real-time data, they need to consider how to set up a channel or a pipeline to access the data.

Organisations also need to ensure the availability of constant real-time data. ML models should not be trained on a single fixed set of data, so organisations need to set them up so that they can retrain their models to adapt to the changing behaviour of the data and the systems that they're working with.
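As a minimal sketch of that retraining pattern, scikit-learn's partial_fit interface updates a model on each new batch of data instead of restarting from one fixed dataset; the data stream here is synthetic.

```python
# Minimal sketch of retraining on a stream of data rather than a single
# fixed set, using scikit-learn's incremental partial_fit API.
import numpy as np
from sklearn.linear_model import SGDClassifier

model = SGDClassifier(loss="log_loss")  # online logistic regression
classes = np.array([0, 1])
rng = np.random.default_rng(0)

for batch in range(10):                 # e.g., one batch per hour
    X = rng.normal(size=(200, 5))
    drift = 0.1 * batch                 # the data's behaviour changes
    y = (X[:, 0] + drift * X[:, 1] > 0).astype(int)
    model.partial_fit(X, y, classes=classes)  # update, don't restart
```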

Additionally, there is a significant talent shortage. While the number of qualified data scientists is growing, only a small number can be trained each year.

There are concerns about access to certain types of data, particularly when you have different groups of employees or other stakeholders coming in at different times to work on projects. So organisations should consider filtering out any potentially sensitive information from the data first so that the rest can be used to deploy ML algorithms.

Another issue to overcome is how to fulfil the demand for data scientists. While it's great to see more data scientists emerging in the workplace, a lot of time still goes into training them, so supply is still not keeping up with the rising demand.

However, more people who have been trained in other areas, such as senior business analysts and software engineers, are increasingly expanding their knowledge of data science and ML, which can help bridge that gap.

Additionally, organisations will have IT business analysts who have experience with handling databases. Even if they're not programmers, they're still analytically minded, so they can take advantage of ML through self-training too.

All of these developments are following a positive trend, as tools and platforms are beginning to allow a broader range of users to engage with ML and make it useful for them.

I see two main misconceptions about ML that relate to its complexity and capabilities. First, businesses often think that ML is very complex and requires PhDs to get value out of an implementation.

Many relatively simple ML algorithms can be applied to business data to provide predictions or classifications. On the other hand, there is an unrealistic conception that ML is a panacea for all business problems. The sweet spot is understanding the realistic capabilities of different, well-understood ML algorithms and matching them with the right business data to derive real value.

Businesses of all sizes should be working with universities or creating apprenticeship programmes to bring in fresh talent.

For example, the Computer Science department at the University of San Francisco offers a project course for both undergraduates and graduates for one term, where they typically do small project work with an industry sponsor. This not only allows students to work with a variety of different companies, but it also makes the recruitment process for businesses far easier.

Another way to help bridge the skills gap is through investment in technology that will lift the burden off IT professionals.

Low-code/no-code platforms are a prime example of this, as they can enable data tasks to be undertaken by people outside of the IT department working in the lines of business.

Currently, getting the right data to deploy ML algorithms is an incredibly time-consuming process. A lot of time is spent trying to access and sift through vast volumes of disorganised data with manual coding, leaving IT professionals little time to focus on higher-value tasks.

By investing in the right low-code/no-code technology, businesses can easily automate data pipelines, giving all departments regular access to real-time data, and make ML processes as seamless as possible with little to no coding required.

Businesses can look at how investment in emerging technologies will benefit them in two ways: either to get ahead of their competition or to prevent their organisation from becoming obsolete.

Businesses need to follow and even move ahead of technology trends, not only to offer a better experience and more effective utilisation of their resources, but also to continue providing the services that their users and customers expect.

Eventually, all organisations will need to adopt ML simply because that will become an expectation, so that applications and services can better anticipate what their users are attempting to do and to provide recommendations or predictions that enable them to achieve their goals more rapidly.

This doesn't apply just to investment in the technologies, but to skills training as well. Businesses need to ensure that people are trained well to utilise these technologies, and also continue to help expand their skill sets to harness the full potential of these emerging technologies.

View post:
Q&A: SnapLogic CTO makes the case for investment in machine learning - IT Brief Australia

The Convergence of RPA and Automated Machine Learning – AiiA


The future is now. We've been discussing the fact that RPA truly transforms the costs, accuracy, productivity, speed and efficiency of your enterprise. That transformation is all the more powerful with cognitive solutions baked-in.

Our old friends at Automation Anywhere combine forces with our new friends at DataRobot to discuss the integration and convergence of RPA and automated ML, and how that combination can propel your enterprise further through this fourth industrial revolution.

Watch the session on demand now.

The Convergence of RPA and Automated Machine Learning
Greg van Rensburg, Director, Solutions Consulting, Automation Anywhere
Colin Priest, Vice President, AI Strategy, DataRobot

Robotic Process Automation (RPA) has disrupted repetitive business processes across a variety of industries. The combination of RPA, cognitive automation, and analytics is a game changer for unstructured data processing and for gaining real-time insights. The next frontier? Truly complete, end-to-end process automation with AI-powered decision-making and predictive abilities. Join Automation Anywhere and DataRobot at this session to learn how organisations are using business logic and structured inputs, through a combination of RPA and automated machine learning, to automate business processes, reduce customer churn and transform to digital operating models.

More here:
The Convergence of RPA and Automated Machine Learning - AiiA

Machine Learning Software is Now Doing the Exhausting Task of Counting Craters On Mars – Universe Today

Does the life of an astronomer or planetary scientist seem exciting?

Sitting in an observatory, sipping warm cocoa, with high-tech tools at your disposal as you work diligently, surfing along on the wavefront of human knowledge, surrounded by fine, bright people. Then one day: Eureka! All your hard work and the work of your colleagues pays off, and you deliver to humanity a critical piece of knowledge. A chunk of knowledge that settles a scientific debate, or that ties a nice bow on a burgeoning theory, bringing it all together. Conferences... tenure... Nobel Prize?

Well, maybe in your first year of university you might imagine something like that. But science is work. And as we all know, not every minute of one's working life is super-exciting and gratifying.

Sometimes it can be dull and repetitious.

It's probably not anyone's dream, when they begin their scientific education, to sit in front of a computer poring over photos of the surface of Mars, counting the craters. But someone has to do it. How else would we all know how many craters there are?

Mars is the subject of intense scientific scrutiny. Telescopes, rovers, and orbiters are all working to unlock the planet's secrets. There are a thousand questions concerning Mars, and one part of understanding the complex planet is understanding the frequency of meteorite strikes on its surface.

NASA's Mars Reconnaissance Orbiter (MRO) has been orbiting Mars for 14.5 years now. Along with the rest of its payload, the MRO carries cameras. One of them is called the Context (CTX) Camera. As its name says, it provides context for the other cameras and instruments.

MRO's powerhouse camera is called HiRISE (High-Resolution Imaging Science Experiment). While the CTX camera takes wider-view images, HiRISE zooms in to take precision images of details on the surface. The pair make a potent team, and HiRISE has treated us to more gorgeous and intriguing pictures of Mars than any other instrument.

But the cameras are kind of dumb in a scientific sense. It takes a human being to go over the images. As a NASA press release tells us, it can take 40 minutes for one researcher to go over a CTX image, hunting for small craters. Over the lifetime of the MRO so far, researchers have found over 1,000 craters this way. They're not just looking for craters; they're interested in any changes on the surface: dust devils, shifting dunes, landslides, and the like.

AI researchers at NASA's Jet Propulsion Laboratory in Southern California have been trying to do something about all the time it takes to find things of interest in all of these images. They're developing a machine learning tool to handle some of that workload. On August 26th, 2020, the tool had its first success.

On some date between March 2010 and May 2012, a meteor slammed into Mars' thin atmosphere. It broke into several pieces before it struck the surface, creating what looks like nothing more than a black speck in CTX camera images of the area. The new AI tool, called an automated fresh impact crater classifier, found it. Once it did, NASA used HiRISE to confirm it.

That was the classifier's first find, and in the future, NASA expects AI tools to do more of this kind of work, freeing human minds up for more demanding thinking. The crater classifier is part of a broader JPL effort named COSMIC (Content-based On-board Summarization to Monitor Infrequent Change). The goal is to develop these technologies not only for MRO, but for future orbiters, not only at Mars, but wherever else orbiters find themselves.

Machine learning tools like the crater classifier have to be trained. For its training, it was fed 6,830 CTX camera images. Among those images were ones containing confirmed craters, and others that contained no craters. That taught the tool what to look for and what not to look for.
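As a rough illustration of that training setup, here is a minimal sketch of a binary crater/no-crater image classifier in Keras; the architecture, tile size, and training call are assumptions for this example, not JPL's actual classifier.

```python
# Illustrative sketch of a binary image classifier of the kind described:
# trained on image tiles labeled crater / no-crater. Architecture and
# input size are assumptions.
import tensorflow as tf

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(64, 64, 1)),        # grayscale CTX tile
    tf.keras.layers.Conv2D(16, 3, activation="relu"),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Conv2D(32, 3, activation="relu"),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(64, activation="relu"),
    tf.keras.layers.Dense(1, activation="sigmoid"),   # P(crater)
])
model.compile(optimizer="adam", loss="binary_crossentropy",
              metrics=["accuracy"])
# Hypothetical usage: fit on the 6,830 labeled tiles, then score new ones.
# model.fit(labeled_tiles, labels, epochs=5)
# scores = model.predict(new_ctx_tiles)
```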

Once it was trained, JPL took the system's training wheels off and let it loose on over 110,000 images of the Martian surface. JPL has its own supercomputer: a cluster containing dozens of high-performance machines that can work together. The result? The AI running on that powerful machine took only five seconds to complete a task that takes a human about 40 minutes. But it wasn't easy to do.

"It wouldn't be possible to process over 112,000 images in a reasonable amount of time without distributing the work across many computers," said JPL computer scientist Gary Doran in a press release. "The strategy is to split the problem into smaller pieces that can be solved in parallel."
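A toy sketch of that split-and-parallelize strategy using Python's multiprocessing; the classifier stub and file names are hypothetical.

```python
# Sketch of the split-into-pieces strategy Doran describes: fan the
# image list out across worker processes. classify_tile is a stand-in.
from multiprocessing import Pool

def classify_tile(path):
    # Placeholder: load the image at `path` and run the trained model.
    return path, 0.0  # (image, crater score)

if __name__ == "__main__":
    paths = [f"ctx_{i:06d}.png" for i in range(112_000)]  # hypothetical
    with Pool() as pool:                  # one worker per CPU core
        results = pool.map(classify_tile, paths, chunksize=512)
```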

But while the system is powerful, and represents a huge savings of human time, it can't operate without human oversight.

"AI can't do the kind of skilled analysis a scientist can," said JPL computer scientist Kiri Wagstaff. "But tools like this new algorithm can be their assistants. This paves the way for an exciting symbiosis of human and AI investigators working together to accelerate scientific discovery."

Once the crater finder scores a hit in a CTX camera image, it's up to HiRISE to confirm it. That happened on August 26th, 2020. After the crater finder flagged a dark smudge in a CTX camera image of a region named Noctis Fossae, scientists used HiRISE to take a closer look. That confirmed the presence of not one crater, but a cluster of several, resulting from the objects that struck Mars between March 2010 and May 2012.

With that initial success behind them, the team developing the AI has submitted more than 20 other CTX images to HiRISE for verification.

This type of software system can't run on an orbiter yet. Only an Earth-bound supercomputer can perform this complex task. All of the data from CTX and HiRISE is sent back to Earth, where researchers pore over it, looking for images of interest. But the AI researchers developing this system hope that will change in the future.

"The hope is that in the future, AI could prioritize orbital imagery that scientists are more likely to be interested in," said Michael Munje, a Georgia Tech graduate student who worked on the classifier as an intern at JPL.

Theres another important aspect to this development. It shows how older, still-operational spacecraft can be sort of re-energized with modern technological power, and how scientists can wring even more results from them.

Ingrid Daubar is one of the scientists working on the system. She thinks that this new tool will help find more craters that are eluding human eyes. And if it can, it'll help build our knowledge of the frequency, shape, and size of meteor strikes on Mars.

"There are likely many more impacts that we haven't found yet," Daubar said. "This advance shows you just how much you can do with veteran missions like MRO using modern analysis techniques."

This new machine learning tool is part of a broader-based NASA/JPL initiative called COSMIC (Content-based On-board Summarization to Monitor Infrequent Change). That initiative has a motto: "Observe much, return best."

The idea behind COSMIC is to create a robust, flexible orbital system for conducting planetary surveys and change monitoring in the Martian environment. Due to bandwidth considerations, many images are never downloaded to Earth. Among other goals, the system will autonomously detect changes in non-monitored areas, and provide relevant, informative descriptions of onboard images to advise downlink prioritization. The AI that finds craters is just one component of the system.

Data management is a huge and growing challenge in science. Other missions, like NASA's Kepler planet-hunting spacecraft, generated an enormous amount of data. In an effort that parallels what COSMIC is trying to do, scientists are using new methods to comb through all of Kepler's data, sometimes finding exoplanets that were missed in the original analysis.

And the upcoming Vera C. Rubin Observatory will be another data-generating monster. In fact, managing all of its data is considered to be the most challenging part of that entire project. It'll generate about 200,000 images per year, or about 1.28 petabytes of raw data. That's far more data than humans will be able to deal with.

In anticipation of so much data, the people behind the Rubin Observatory developed the LSSTC Data Science Fellowship Program. It's a two-year program designed for grad school curricula that will explore topics including statistics, machine learning, information theory, and scalable programming.

It's clear that AI and machine learning will have to play a larger role in space science. In the past, the amount of data returned by space missions was much more manageable. The instruments gathering the data were simpler, the cameras were much lower resolution, and the missions didn't last as long (not counting the Viking missions).

And though a system designed to find small craters on the surface of Mars might not capture the imagination of most people, it's indicative of what the future will hold.

One day, more scientists will be freed from sitting for hours at a time going over images. They'll be able to delegate some of that work to AI systems like COSMIC and its crater finder.

We'll probably all benefit from that.


Here is the original post:
Machine Learning Software is Now Doing the Exhausting Task of Counting Craters On Mars - Universe Today

Samsung launches online programme to train UAE youth in AI and machine learning – The National

Samsung is rolling out a new course offering an introduction to machine learning and artificial intelligence in the UAE.

The course, which is part of its global Future Academy initiative, will target UAE residents between the ages of 18 and 35 who have a background in science, technology, engineering and mathematics and are interested in pursuing a career that would benefit from knowledge of AI, the South Korean firm said.

The five-week programme will be held online and cover subjects such as statistics, algorithms and programming.

"The launch of the Future Academy in the UAE reaffirms our commitment to drive personal and professional development and ensure this transcends across all areas in which we operate," said Jerric Wong, head of corporate marketing at Samsung Gulf Electronics.

In July, Samsung announced a similar partnership with Misk Academy to launch AI courses in Saudi Arabia.

The UAE, a hub for start-ups and venture capital in the Arab world, is projected to benefit the most in the region from AI adoption. The technology is expected to contribute up to 14 per cent of the country's gross domestic product, equivalent to Dh352.5 billion, by 2030, according to a report by consultancy PwC.

In Saudi Arabia, AI is forecast to add 12.4 per cent to GDP.

Held under the theme "be ready for tomorrow by learning about it today", the course will be delivered in a blended-learning, self-paced format. Participants can access presentations and pre-recorded videos detailing their course materials.

Through the Future Academy's specialised curriculum, participants will learn about the tools and applications that feature prominently in AI and machine learning-related workplaces, Samsung said.

The programme promises to be beneficial, providing the perfect platform for determined beginners and learners to build their knowledge in machine learning and establish a strong understanding of the fundamentals of AI, it added.

Applicants can apply here by October 29.


Excerpt from:
Samsung launches online programme to train UAE youth in AI and machine learning - The National

Long-term PM 2.5 Exposure and the Clinical Application of Machine Learning for Predicting Incident Atrial Fibrillation – DocWire News

The clinical impact of fine particulate matter (PM2.5) air pollution on incident atrial fibrillation (AF) has not been well studied. We used integrated machine learning (ML) to build several incident AF prediction models that include average hourly measurements of PM2.5 for 432,587 subjects from the Korean general population. We compared these incident AF prediction models using the c-index, net reclassification improvement index (NRI), and integrated discrimination improvement index (IDI). ML using the boosted ensemble method exhibited a higher c-index (0.845 [0.837-0.853]) than existing traditional regression models using CHA2DS2-VASc (0.654 [0.646-0.661]), CHADS2 (0.652 [0.646-0.657]), or HATCH (0.669 [0.661-0.676]) scores (each p < 0.001) for predicting incident AF.

As feature selection algorithms identified PM2.5 as a highly important variable, we applied PM2.5 to the prediction of incident AF and constructed scoring systems. The prediction performances increased significantly compared with models without PM2.5 (c-indices: boosted ensemble ML, 0.954 [0.949-0.959]; PM-CHA2DS2-VASc, 0.859 [0.848-0.870]; PM-CHADS2, 0.823 [0.810-0.836]; PM-HATCH score, 0.849 [0.837-0.860]; each interaction, p < 0.001; NRI and IDI were also positive). ML combining readily available clinical variables and PM2.5 data was found to predict incident AF better than models without PM2.5 or even established risk prediction approaches in a general population exposed to high air pollution levels.
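To make the comparison concrete, here is an illustrative sketch on synthetic data: a boosted ensemble versus a crude point-score baseline, compared by c-index (equivalent to ROC AUC for a binary outcome). Nothing here reproduces the study's cohort, variables, or scores.

```python
# Illustrative sketch of the model comparison in the abstract: boosted
# ensemble vs. a simple clinical point score, compared by c-index (AUC).
# All data is synthetic; not the study's cohort.
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 20_000
age = rng.uniform(30, 80, n)
pm25 = rng.gamma(4, 10, n)          # hypothetical exposure (ug/m3)
bp = rng.normal(120, 15, n)
y = ((0.03 * age + 0.02 * pm25 + 0.01 * bp
      + rng.normal(0, 1.5, n)) > 4.5).astype(int)

X = np.column_stack([age, pm25, bp])
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

ml = GradientBoostingClassifier().fit(X_tr, y_tr)
print("boosted c-index:", roc_auc_score(y_te, ml.predict_proba(X_te)[:, 1]))
# Crude point-score baseline (a stand-in for CHA2DS2-VASc-style scoring
# that ignores exposure):
score = (X_te[:, 0] > 65).astype(int) + (X_te[:, 2] > 140).astype(int)
print("score c-index:", roc_auc_score(y_te, score))
```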

See more here:
Long-term PM 2.5 Exposure and the Clinical Application of Machine Learning for Predicting Incident Atrial Fibrillation - DocWire News

Top Machine Learning Companies in the World – Virtual-Strategy Magazine

Machine learning is a complex field of science, grounded in scientific research and a deep understanding of computer science. Your vendor must have proven experience in this field.

In this post, we have collected 10 top machine learning companies worldwide. Each of them has at least 5 years of experience, has worked on dozens of ML projects, and enjoys high rankings on popular online aggregators. We have carefully studied their portfolios and what former clients say about working with them. Contracting a vendor from this list, you can be sure that you will receive the highest quality.

Best companies for machine learning

1. Serokell

Serokell is a software development company that focuses on R&D in programming and machine learning. Serokell is the founder of Serokell Labs, an interactive laboratory that studies new theories of pure and applied mathematics and academic and practical applications of ML.

Serokell is an experienced, fast-growing company that unites qualified software engineers and scientists from all over the world. Combining scientific research and a data-driven approach with business thinking, they manage to deliver exceptional products to the market. Serokell has experience with custom software development in blockchain, fintech, edtech, and other fields.

2. Dogtown Media

Dogtown Media is a software vendor that applies artificial intelligence and machine learning in the field of mobile app development. AI helps them please their customers with outstanding user experiences and helps businesses scale and develop. Using machine learning in mobile apps, they make them smarter, more efficient, and more accurate.

Among the clients of Dogtown Media are Google, YouTube, and other IT companies and startups that use machine learning daily.

3. Iflexion

This custom software development company covers every aspect of software engineering, including machine learning.

Iflexion has more than 20 years of tech experience. They are proficient at building ML-powered web applications for e-commerce as well as applying artificial intelligence technologies to e-learning, augmented reality, computer vision, and big data analytics. In their portfolio, you can find a dating app with a recommender system, a travel portal, and countless business intelligence projects that prove their expertise in the field.

4. ScienceSoft

ScienceSoft is an experienced provider of top-notch IT services that works across different niches. They have a portfolio full of business-minded projects in data analytics, internet of things, image analysis, and e-commerce.

Working with ScienceSoft, you trust your project to the hands of R&D masters who can take over the software development process. The team makes fast, data-driven decisions and delivers high-quality products in reduced time.

5. Icreon

If you are looking for an innovative software development company that helps businesses amplify their net impact on customers and employees, pay attention to Icreon.

This machine-learning software vendor works with market leaders in different niches and engineers AI strategies for their business prosperity. Icreon has firsthand, real-world experience building out applications, platforms, and ecosystems driven by machine learning and artificial intelligence.

6. Hidden Brains

Hidden Brains is a software development firm that specializes in AI, ML, and IoT. During their 17 years in business, they have used their profound knowledge of the latest technologies to deliver projects in healthcare, retail, education, fintech, logistics, and more.

Hidden Brains offers a broad set of machine learning and artificial intelligence consulting services, putting the power of machine learning in the hands of every startupper and business owner.

7. Imaginovation

Imaginovation was founded in 2011 and focuses on web design and development. It actively explores all the possibilities of artificial intelligence in its work.

The agency's goal is to boost the business growth of its clients by providing software solutions for recommendation engines, automated speech and text translation, and effectiveness assessment. Its most high-profile clients are Nestle and MetLife.

8. Cyber Infrastructure

Cyber Infrastructure is among the leading machine learning companies, with more than 100 projects in its portfolio. With their AI solutions, they have impacted a whole variety of industries, from hospitality and retail to fintech and high tech.

The team specializes in using advanced technologies to develop AI-powered applications for businesses worldwide. Their effort to create outstanding projects has been recognized by Clutch, Good Firms, and AppFutura.

9. InData Labs

InData Labs is a company that delivers a full package of AI-related services, including data strategy, AI consulting, and AI software development. They have plenty of experience working with the technologies of machine learning, NLP, computer vision, and predictive modeling.

InData Labs analyses its clients' capabilities and needs, designs a future product concept, inserts the ML system into any production type, and improves the previously built models.

10. Spire Digital

Spire Digital is one of the most eminent AI development companies in the USA. They have worked on more than 600 cases and have deep expertise in applying AI in the fields of finance, education, logistics, healthcare, and media. Among other tasks, Spire Digital helps with building and integrating AI into security systems and smart home systems.

Over more than 20 years, the company has earned major awards, including #1 Software Developer In The World from Clutch.co and a spot on Inc. 5000's Fastest Growing Companies In America.

Conclusion

Working with a top developer, you choose high-quality software development and extensive expertise in machine learning. They apply the most cutting-edge technologies to help your business expand and grow.

Media ContactCompany Name: SerokellContact Person: Media RelationsEmail: Send EmailPhone: (+372) 699-1531Country: EstoniaWebsite: https://serokell.io/

Link:
Top Machine Learning Companies in the World - Virtual-Strategy Magazine