Hospices Leverage Machine Learning to Improve Care – Hospice News

Hospice providers are using new machine learning tools to identify patients in need of their services earlier in the course of their illnesses and to ensure that patients receive appropriate levels of home visits in their final days.

While hospice utilization is rising, lengths of stay for many patients remain too short for them to receive the full benefit of hospice care. Hospice utilization among Medicare decedents exceeded 50% for the first time in 2018, according to the National Hospice & Palliative Care Organization (NHPCO). More than 27% of patients in 2017, however, were in hospice for seven days or less, with another 12.7% in hospice for less than 14 days, NHPCO reported.

While not a panacea, machine learning systems can help hospices engage patients earlier in their illness trajectory. Machine learning, a form of artificial intelligence, uses algorithms and statistical models to detect patterns in data and make predictions based on those patterns.

"With machine learning, you actually begin with the outcome, for example, the people who were readmitted to the hospital and those who were not. Then the software itself is able to learn what the rules are, all the differences between the people who did or did not get admitted," said Leonard D'Avolio, founder and CEO of the health care performance improvement firm Cyft and assistant professor at Harvard Medical School. "The advantage of the software being able to learn these patterns, instead of a human telling it the patterns, is that software can consider so many more factors or variables than a human could, and it can do it in microseconds. It's basically checking all of the patterns that have come before and predicting the next step forward."
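D'Avolio's description — begin with a labelled outcome and let the software learn the separating rules — can be sketched with a toy classifier. This is an illustrative example only, not Cyft's actual system; the features and patient data below are invented for demonstration.

```python
from sklearn.linear_model import LogisticRegression

# Toy data: each row is a patient (age, prior admissions, chronic
# conditions); each label records whether that patient was readmitted.
# Invented numbers, for illustration only.
X = [
    [72, 3, 4], [65, 0, 1], [80, 5, 6],
    [58, 1, 2], [77, 4, 5], [61, 0, 1],
]
y = [1, 0, 1, 0, 1, 0]  # 1 = readmitted, 0 = not readmitted

# The model learns the "rules" (weights) separating the two groups,
# rather than a human encoding them as if/then criteria.
model = LogisticRegression(max_iter=1000).fit(X, y)

# Score a new patient: the output is a readmission probability.
risk = model.predict_proba([[75, 2, 3]])[0][1]
print(f"estimated readmission risk: {risk:.2f}")
```

A production system would consider far more variables, which is precisely the advantage D'Avolio describes.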

Machine learning systems can analyze data from claims, electronic medical records or other sources of information to predict when a patient may be in need of hospice or palliative care, as well as which patients are at the highest risk of hospitalization, among other outcomes.

Minnesota-based St. Croix Hospice, a portfolio company of the Chicago-based private equity firm Vistria Group, uses a system from the recently launched technology firm Muse Healthcare, which applies a predictive model to hospice clinical data to determine which patients are likely to pass away within the next seven to 12 days.

Patients and families tend to need more intense levels of service as the patient nears their final moments. For this reason, regulators mandate that hospices collect data on the number of visits their patients receive during the last seven and the last three days of life as a part of quality reporting programs.

The U.S. Centers for Medicare & Medicaid Services (CMS) requires hospice providers to submit data for two measures pertaining to the number of hospice visits a patient receives when death is imminent. The three-day measure assesses the percentage of patients receiving at least one visit from a registered nurse, physician, nurse practitioner, or physician assistant in the last three days of life.

A second seven-day measure assesses the percentage of patients receiving at least two visits from a social worker, chaplain or spiritual counselor, licensed practical nurse, or hospice aide in the last seven days of life.

CMS currently publicly reports hospices' performance on the three-day measure on Hospice Compare, but to date has not published results on the seven-day measure, citing a need for further testing.

Since adopting machine learning, St. Croix has achieved 100% compliance with the requirement for visits during the final three days of life, according to the company's Chief Medical Officer, Andrew Mayo, M.D.

"I really view it as a sixth vital sign. It provides our clinical team with additional information that helps them make decisions about care. It doesn't replace the need for human contact and evaluation," Mayo told Hospice News. "Quite the contrary, it can trigger increased involvement at a time when patients, their families and caregivers may need increased hospice involvement and guidance."

Research indicates that automatic screening and notification systems make identification of patient needs more efficient, saving hospice and palliative care teams the time-consuming task of reviewing charts and allowing them to reach out to patients before receiving a physician referral or order.

Early identification of patient needs can allow hospices to more frequently apply the Medicare service intensity add-on (SIA), which increases payment to hospices for nursing visits close to the end of life.

CMS introduced the SIA in 2016 to allow hospices to bill an additional hourly payment for registered nurse and social worker visits during the last seven days of a patient's life, in addition to their standard per diem reimbursement.
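As a rough illustration of how an hourly add-on like this might be computed, the sketch below totals capped visit hours across a patient's final week. The hourly rate and daily cap here are placeholders for illustration, not actual CMS figures.

```python
# Illustrative sketch of the SIA billing logic: an hourly add-on for RN
# and social worker visit time across a patient's final seven days, paid
# on top of the per-diem. The rate and cap below are placeholders, not
# actual CMS figures.
SIA_HOURLY_RATE = 60.00   # hypothetical hourly rate
DAILY_CAP_HOURS = 4.0     # hypothetical cap on billable hours per day

def sia_payment(visit_hours_by_day):
    """Total SIA add-on for RN/social worker visit hours over the
    patient's last seven days of life."""
    return sum(min(h, DAILY_CAP_HOURS) * SIA_HOURLY_RATE
               for h in visit_hours_by_day)

# Hours of qualifying visits on each of the final seven days.
print(sia_payment([1.0, 2.5, 0.0, 3.0, 5.0, 4.0, 2.0]))  # 990.0
```

The earlier a hospice identifies an imminently dying patient, the more of these qualifying visits it can schedule, which is the revenue opportunity Mosher describes below.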

"There is a top-line revenue opportunity there. Along with that, we're talking about additional visits in the last seven days. There's value in terms of the outcomes that are being tracked, and those outcomes affect quality scores for the providers," Bryan Mosher, data scientist for Muse Healthcare, told Hospice News. "There's longer-term value there for them."

What is the role of machine learning in industry? – Engineer Live

In 1950, Alan Turing developed the Turing test to answer the question "can machines think?" Since then, machine learning has gone from being just a concept to a process relied on by some of the world's biggest companies. Here Sophie Hand, UK country manager at industrial parts supplier EU Automation, discusses the applications of the different types of machine learning that exist today.

Machine learning is a subset of artificial intelligence (AI) in which computers independently learn to do something they were not explicitly programmed to do. They do this by learning from experience, leveraging algorithms to discover patterns and insights in data. This means machines don't need to be programmed to perform exact tasks on a repetitive basis.

Machine learning is rapidly being adopted across several industries; according to Research and Markets, the market is predicted to grow to US$8.81 billion by 2022, at a compound annual growth rate of 44.1 per cent. One of the main reasons for its growing use is that businesses are collecting Big Data, from which they need to obtain valuable insights. Machine learning is an efficient way of making sense of this data, for example the data that sensors collect on the condition of machines on the factory floor.

As the market develops and grows, new types of machine learning will emerge and allow new applications to be explored. However, many examples of current machine learning applications fall into two categories: supervised learning and unsupervised learning.

A popular type of machine learning is supervised learning, typically used in applications where historical data is used to train models that predict future events, such as fraudulent credit card transactions. This form of machine learning identifies inputs and outputs and trains algorithms using labelled examples. Supervised learning uses methods such as classification, regression, prediction and gradient boosting for pattern recognition, then uses those patterns to predict the values of the labels on unlabelled data.
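The supervised workflow described above — train on labelled examples, then predict labels for unlabelled data — can be sketched with gradient boosting, one of the methods just named. The transaction data here is invented for illustration.

```python
from sklearn.ensemble import GradientBoostingClassifier

# Toy historical transactions: (amount, hour of day, is_foreign), with
# labels marking which ones turned out to be fraudulent. Invented data.
X_train = [
    [2500, 3, 1], [40, 14, 0], [3100, 2, 1], [15, 10, 0],
    [2900, 4, 1], [60, 18, 0], [2700, 1, 1], [25, 12, 0],
]
y_train = [1, 0, 1, 0, 1, 0, 1, 0]  # 1 = fraud, 0 = legitimate

# Gradient boosting, one of the pattern-recognition methods named above,
# fits the labelled examples, then predicts labels for unlabelled data.
clf = GradientBoostingClassifier(n_estimators=50).fit(X_train, y_train)

predicted = clf.predict([[2800, 2, 1], [30, 13, 0]])
print(predicted)  # flags the first transaction, clears the second
```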

This form of machine learning is currently being used in drug discovery and development with applications including target validation, identification of biomarkers and the analysis of digital pathology data in clinical trials. Using machine learning in this way promotes data-driven decision making and can speed up the drug discovery and development process while improving success rates.

Unlike supervised learning, unsupervised learning works with datasets that have no historical labels. Instead, it explores the collected data to find structure and identify patterns. Unsupervised machine learning is now being used in factories for predictive maintenance purposes. Machines can learn the data patterns associated with faults in the system and use this information to identify problems before they arise.
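A minimal sketch of this idea uses an off-the-shelf anomaly detector: it learns the shape of "normal" from unlabelled readings and flags anything that does not fit. The sensor data is invented and this is not any particular vendor's system.

```python
from sklearn.ensemble import IsolationForest

# Unlabelled sensor readings from a healthy machine: (vibration, temp C).
# There are no historical failure labels; the detector learns the shape
# of "normal" operation from the structure of the data alone.
normal = [[0.50, 60], [0.52, 61], [0.48, 59], [0.51, 60],
          [0.49, 62], [0.53, 61], [0.50, 58], [0.47, 60]]

detector = IsolationForest(random_state=0).fit(normal)

# predict() returns 1 for normal operation and -1 for an anomaly, which
# is the cue to schedule maintenance before a breakdown occurs.
flags = detector.predict([[0.51, 60], [2.10, 95]])
print(flags)
```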

Using machine learning in this way leads to a decrease in unplanned downtime, as manufacturers are able to order replacement parts from an automation equipment supplier before a breakdown occurs, saving time and money. According to a survey by Deloitte, using machine learning technologies in the manufacturing sector reduces unplanned machine downtime by 15 to 30 per cent and cuts maintenance costs by 30 per cent.

It's no longer just humans that can think for themselves: machines, such as Google's Duplex, are now able to pass the Turing test. Manufacturers can make use of machine learning to improve maintenance processes and enable them to make real-time, intelligent decisions based on data.

PROTXX Launches Collaboration with Alberta Healthcare and Machine Learning Innovation Ecosystems – BioSpace

MENLO PARK, Calif. and CALGARY, Alberta, Jan. 14, 2020 /PRNewswire/ -- Silicon Valley-based digital healthcare technology pioneer PROTXX, Inc. today announced a broad collaboration with Alberta-based organizations including Alberta Health Services (AHS), independent clinical healthcare providers, machine learning developers, and three of the province's universities. Pilot testing of the PROTXX precision medicine platform has been launched to quantify corresponding improvements in healthcare service quality, patient outcomes, and provider economics.

The PROTXX precision medicine platform integrates wearable sensor and machine learning innovations to replace bulky and expensive clinical equipment and time-consuming testing procedures for a variety of complex medical conditions. PROTXX sensors are designed to be worn unobtrusively by users of any age and in any clinical, athletic, industrial, or military environment, significantly expanding patient and provider access, enabling continuous and remote patient monitoring, and enhancing the value of telemedicine initiatives.

Multiple stakeholders within the Alberta healthcare ecosystem have identified an exciting range of innovative applications in clinical diagnoses, treatment, and rehabilitation of complex medical conditions in which multiple physiological systems are impaired, including injuries such as concussions, diseases such as Parkinson's disease, and age-related disorders such as stroke. The ease of use and level of quantitative assessment enabled by the PROTXX platform provide opportunities for better healthcare quality, outcomes, and value for large and diverse groups of Albertans.

PROTXX, Inc. subsidiary PROTXX Medical Ltd was recently incorporated in Alberta in order to support product development and pilot deployment initiatives with local customers and R&D partners, and to leverage the province's world-class expertise in both healthcare service delivery and machine learning to expand the company's employee base. Tanya Fir, Alberta Minister of Economic Development, commented: "I want to commend PROTXX for choosing Alberta. Our province has a dynamic healthcare innovation ecosystem and being recognized for our talent and expertise is a huge win that will spur future developments across multiple sectors of the economy. These pilots also support the strategic focus our province is taking for economic development through innovation, and I look forward to what's in store for the future."

PROTXX CEO and Founder, John Ralston, added: "Alberta's unique combination of healthcare innovation initiatives, world-class research institutions, and economic diversification strategy presents exciting opportunities for PROTXX to expand the development and commercial deployment of our innovations in wearable devices and machine learning, and to prove out the economic value of more quantified management of medical conditions that affect global patient populations numbering in the billions."

PROTXX was originally introduced into the Alberta health care ecosystem by Connection Silicon Valley and New West Networks, a Calgary and San Francisco based team supporting the efforts of Alberta's Economic Development Department to expand and diversify the province's tech sector. Christy Pierce, Director at New West Networks said: "AHS has been a great partner to work with on this investment opportunity. Their Innovation, Impact and Evidence team was instrumental in providing guidance on AHS' priorities and potential opportunities with the appropriate Strategic Clinical Networks (SCN) and clinics. AHS has also worked with us and Innovate Calgary to introduce PROTXX to collaborative research opportunities at the province's universities, providing exciting research opportunities and exposure to the commercial innovation process for our faculty and students."

About PROTXX, Inc. (http://protxx.com/)

PROTXX has developed clinical grade wearable sensors that enable rapid non-invasive classification and quantification of neurological, sensory, and musculoskeletal impairments due to fatigue, injury, and disease. PROTXX has further utilized the company's large proprietary data sets to develop and train machine learning models that can automate analytical tasks such as classifying specific medical conditions based upon unique patterns detected in the sensor data. PROTXX customers and partners in Canada, the U.S., the U.K., and Japan are helping healthcare payers rein in costs, providers improve quality of care, and consumers gain greater access to higher quality care and improved outcomes. PROTXX innovations have been recognized with numerous industry, academic, and government awards from healthcare, medical engineering, wearable technology, data analytics, professional sports, defense, and state and local government organizations.

Media inquiries

PROTXX, Inc.:John Ralston, CEO - email: john.ralston@protxx.com - tel: 650.215.8418.

New West Networks:Christy Pierce, Director - email: christy@newwestnetworwks.ca - tel: 403.988.8156.

View original content to download multimedia:http://www.prnewswire.com/news-releases/protxx-launches-collaboration-with-alberta-healthcare-and-machine-learning-innovation-ecosystems-300985235.html

SOURCE PROTXX, Inc.

The 4 Hottest Trends in Data Science for 2020 – Machine Learning Times – machine learning & data science news – The Predictive Analytics Times

Originally published in Towards Data Science, January 8, 2020

2019 was a big year for all of Data Science.

Companies all over the world across a wide variety of industries have been going through what people are calling a digital transformation. That is, businesses are taking traditional business processes such as hiring, marketing, pricing, and strategy, and using digital technologies to make them 10 times better.

Data Science has become an integral part of those transformations. With Data Science, organizations no longer have to make their important decisions based on hunches, best guesses, or small surveys. Instead, they're analyzing large amounts of real data and basing their decisions on real, data-driven facts. That's really what Data Science is all about: creating value through data.

This trend of integrating data into core business processes has grown significantly, with interest increasing more than fourfold over the past five years according to Google Search Trends. Data is giving companies a sharp advantage over their competitors. With more data and better Data Scientists to use it, companies can acquire information about the market that their competitors might not even know exists. It's become a game of "data or perish."

Google search popularity of Data Science over the past 5 years. Generated by Google Trends.

In today's ever-evolving digital world, staying ahead of the competition requires constant innovation. Patents have gone out of style, while Agile methodology and catching new trends quickly are very much in.

Organizations can no longer rely on their rock-solid methods of old. If a new trend like Data Science, Artificial Intelligence, or Blockchain comes along, it needs to be anticipated beforehand and adopted quickly.

The following are the 4 hottest Data Science trends for the year 2020. These are trends which have gathered increasing interest this year and will continue to grow in 2020.

(1) Automated Data Science

Even in today's digital age, Data Science still requires a lot of manual work: storing data, cleaning data, visualizing and exploring data, and finally, modeling data to get actual results. That manual work is just begging for automation, hence the rise of automated Data Science and Machine Learning.

Nearly every step of the Data Science pipeline has been or is in the process of becoming automated.

Auto-Data Cleaning has been heavily researched over the past few years. Cleaning big data often takes up most of a Data Scientist's expensive time. Both startups and large companies such as IBM offer automation and tooling for data cleaning.

Another large part of Data Science known as feature engineering has undergone significant disruption. Featuretools offers a solution for automatic feature engineering. On top of that, modern Deep Learning techniques such as Convolutional and Recurrent Neural Networks learn their own features without the need for manual feature design.

Perhaps the most significant automation is occurring in the Machine Learning space. Both DataRobot and H2O have established themselves in the industry by offering end-to-end Machine Learning platforms, giving Data Scientists a very easy handle on data management and model building. AutoML, a method for automatic model design and training, has also boomed over 2019 as these automated models surpass the state of the art. Google, in particular, is investing heavily in Cloud AutoML.
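At a much smaller scale, the core idea behind these platforms — searching over model configurations automatically instead of hand-tuning — can be illustrated with an ordinary cross-validated grid search. This is a sketch of the concept, not how Cloud AutoML, DataRobot, or H2O work internally.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import GridSearchCV

# Synthetic classification data standing in for a real business dataset.
X, y = make_classification(n_samples=200, n_features=8, random_state=0)

# Try several model configurations automatically and keep the best one,
# scored by cross-validation: the essence of automated model selection.
search = GridSearchCV(
    RandomForestClassifier(random_state=0),
    param_grid={"n_estimators": [10, 50], "max_depth": [3, None]},
    cv=3,
)
search.fit(X, y)

print("best configuration:", search.best_params_)
print("cross-validated accuracy:", round(search.best_score_, 3))
```

AutoML systems extend this same loop to search over model families, preprocessing steps, and even neural architectures.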

In general, companies are investing heavily in building and buying tools and services for automated Data Science. Anything to make the process cheaper and easier. At the same time, this automation also caters to smaller and less technical organizations who can leverage these tools and services to have access to Data Science without building out their own team.

(2) Data Privacy and Security

Privacy and security are always sensitive topics in technology. All companies want to move fast and innovate, but losing the trust of their customers over privacy or security issues can be fatal. So they're forced to make it a priority, at least to the bare minimum of not leaking private data.

Data privacy and security has become an incredibly hot topic over the past year, as the issues are magnified by enormous public hacks. Just recently, on November 22, 2019, an exposed server with no security was discovered on Google Cloud. The server contained the personal information of 1.2 billion unique people, including names, email addresses, phone numbers, and LinkedIn and Facebook profile information. Even the FBI came in to investigate. It's one of the largest data exposures of all time.

Zinier raises $90 million to embed field service work with AI and machine learning – VentureBeat

Workplace automation tools are expected to see an uptick in adoption in the next few years. One company leading the charge is San Francisco-based Zinier, which was founded in October 2015 by Andrew Wolf and former TripAdvisor market development manager Arka Dhar. A developer of intelligent field service automation, the startup provides a platform, intelligent service automation and control (ISAC), aimed at fixing machinery before it breaks and maintaining mission-critical client services.

After raising $30 million across three funding rounds, the first of which closed in January 2016, Zinier is gearing up for a major expansion with fresh capital. Today the startup announced that it has raised $90 million in series C funding, nearly quadruple its series B total, led by new investor Iconiq Capital, with participation from Tiger Global Management and return investors Accel, Founders Fund, Nokia-backed NGP Capital, France-based Newfund Capital, and Qualcomm Ventures. The round brings Zinier's total raised to over $120 million, and CEO Dhar says it will support the company's customer acquisition strategy and accelerate expansion of its services across telecom, energy, utilities, and beyond.

Specifically, Zinier plans to expand in Asia Pacific, Europe, and Latin America, with a particular focus on Australia and New Zealand. On the R&D side, it will build out its AI technologies at the platform level and partner with global system integrators.

"Services that we rely on every day, such as electricity, transportation, and communication, are getting by on centuries-old infrastructure that requires a major upgrade for the next generation of users," added Dhar. "A field service workforce powered by both people and automation is necessary to execute the massive amount of work required to not only maintain [this] critical human infrastructure, but to also prepare for growth. Our team is focused on enabling this transformation across industries through intelligent field service automation."

Zinier's eponymous suite delivers insights, general recommendations, and specific tasks by running operations metrics through proprietary AI and machine learning algorithms. As for ISAC, it triggers preventative actions based on equipment health while anticipating stock transfers, and it scans technicians' calendars to help define ongoing and potential problems. A separate component, Zinier's AI configurator, affords control over the recommendations by enabling users to define the corpora on which the recommender systems are trained and to set the algorithms used.

A complementary workflow builder automates routine tasks with custom workflows, and it lets users build, deploy, and update those workflows to their heart's content without having to write custom code. They're also afforded access to the app builder, which supports the building of custom solutions for specific field service problems, as well as the grouping together of key functionalities into a single bundle that can be deployed across teams (subject to roles and permissions).

Coordinators can accept or reject suggestions on the fly; accepting one sets off a series of automated actions. And managers can build custom dashboards from which teams can view insights generated by the analysis of historical trends, and optionally receive proactive alerts configured to address particular pain points, like when a task is at risk of falling behind.

Zinier supports scheduling and dispatching so technicians know their work orders up to months in advance, and it autonomously determines general capacity based on transit time, technician availability, and task prioritization. Elsewhere, Zinier uses AI to predict systems failure by weighing real-time internet of things data against historical trends, and it tracks and manages inventory to ensure technicians consistently have the parts they need.
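A heavily simplified sketch of "weighing real-time internet of things data against historical trends" is a drift check: compare a live reading against the statistics of past readings. This is a generic illustration, not Zinier's algorithm, and all the numbers are invented.

```python
import statistics

# Historical temperature readings from one piece of equipment, standing
# in for the "historical trends" described above. Invented data.
historical_temps = [61, 60, 62, 59, 61, 60, 63, 60, 61, 62]

mean = statistics.mean(historical_temps)
stdev = statistics.stdev(historical_temps)

def failure_risk(reading, threshold=3.0):
    """Flag a live IoT reading sitting more than `threshold` standard
    deviations from the historical mean, a simple drift signal."""
    return abs(reading - mean) / stdev > threshold

print(failure_risk(61))  # a reading consistent with the trend
print(failure_risk(95))  # a reading far outside the trend
```

A production system would model seasonality and multiple correlated sensors, but the comparison of live data against a learned historical baseline is the same.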

On the mobile side of the equation, Zinier's app makes an effort to boost productivity and job satisfaction by ensuring accurate job completion, in part by surfacing contextual and back-office data for technicians in the field. Service workers can access not only relevant documents, but detailed site records and readings from internet of things gadgets.

As my colleague Paul Sawers notes, field service organizations are embracing AI and machine learning at an accelerated pace, and startups are rising to meet the demand. There's the well-funded ServiceMax, which was acquired by GE Digital a few years back for $915 million and in turn sold a majority stake to Silver Lake in December 2018. Salesforce also offers a product called Field Service Lightning, while Microsoft snapped up FieldOne Systems in 2015 and now offers a field service management platform called Dynamics 365 for Field Service. For its part, Oracle in 2014 acquired the reportedly lucrative TOA Technologies, and in 2018 SAP bought AI-powered field service management company Coresystems.

Zinier's ostensible advantage is that its AI-powered platform was developed in-house from the ground up, and it isn't distracted by other business interests. "For Zinier, field service is our sole focus," Dhar told VentureBeat in an earlier interview. "The platform has allowed us to quickly build Field Service Elements, our end-to-end field service automation product. And as we continue to build our expertise in different industries, we can quickly configure specific use case solutions for customers."

Machine learning market to reach $96.7 billion by 2025 – ICLG.com

The global machine learning market is projected to have a compound annual growth rate of 43.8% over the next five years, reaching $96.7 billion by 2025, according to a report by Grand View Research.

The report shows that larger enterprises accounted for the leading market share in 2018, when the market was valued at $6.9 billion.

The major large enterprises employing deep learning, machine learning and decision optimisation to deliver greater business value are identified as Amazon Web Services, Baidu, Google, Hewlett Packard Enterprise Development, Intel and Microsoft, among others.

Small to medium sized enterprises (SMEs) are not left behind, as they too are benefitting from deployment options which allow them to scale up easily and avoid substantial up-front investments.

In addition, the development of customised silicon chips with AI and machine learning competences is increasing companies' adoption of hardware, while at the same time, improved processing tools supplied by companies such as computing start-up SambaNova Systems are expected to grow the market.

The advertising and media sector, which often uses buyer optimisation, data processing and connected AI analysis to predict customer behaviour and improve advertising campaigns, accounted for the largest market share in 2018 owing to these AI capabilities.

However, the report explains that the healthcare sector is expected to surpass this segment to account for the largest share by the end of the forecast period.

Elsewhere, the adoption of AI extends to the United States Army, which has future plans to utilise AI for predictive maintenance in combat vehicles, while the stock market is adopting AI and machine learning technology with an accuracy level of about 60% in market predictions.

Examples of technology-based partnerships include H2O.ai, an open source machine learning and AI platform, which announced a partnership with IBM Corporation in June 2018.

In the media sector, ICLG.com, a brand of legal publishing and media company Global Legal Group, launched an AI-powered search engine using LexSnap technology, in November 2019.

In addition, professional services firm Ernst & Young released the third edition of its blockchain technology on the public Ethereum platform, it announced this week.

Raytheon Developing Machine Learning that will Communicate what it Learned – Financialbuzz.com

Raytheon Company (NYSE: RTN) announced that it is developing a machine learning technology under a $6 million contract from the Defense Advanced Research Projects Agency for the Competency Aware Machine Learning (CAML) program. According to the defense contractor and technology company, systems will be able to communicate the abilities they have learned, the conditions under which the abilities were learned, the strategies they recommend and the situations in which those strategies can be used.

Ilana Heintz, principal investigator for CAML at Raytheon BBN Technologies, explained: "The CAML system turns tools into partners. It will understand the conditions where it makes decisions and communicate the reasons for those decisions."

The machine learning technology will learn through a video game-like process. Instead of giving the system a specific set of rules, the developers will tell the system what choices it has in the game and what the end goal is. By repeatedly playing the game, the system will learn the most effective ways to meet the goal. When successful, the system will then explain itself by recording the conditions and strategies it used to come up with successful outcomes.
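The training loop described above is essentially reinforcement learning: the system is told only its available moves and the goal, then learns a strategy by repeated play. The sketch below uses generic tabular Q-learning on a toy five-cell world; it is illustrative only, not Raytheon's CAML implementation.

```python
import random

# Toy "game": walk a five-cell line (cells 0..4) to reach the goal cell.
# The learner is told only the legal moves and the goal reward.
GOAL, STATES, ACTIONS = 4, 5, (-1, 1)
q = {(s, a): 0.0 for s in range(STATES) for a in ACTIONS}

random.seed(0)
for episode in range(500):
    state = 0
    while state != GOAL:
        # Explore occasionally; otherwise exploit the best known move.
        if random.random() < 0.2:
            action = random.choice(ACTIONS)
        else:
            action = max(ACTIONS, key=lambda a: q[(state, a)])
        nxt = min(max(state + action, 0), STATES - 1)
        reward = 1.0 if nxt == GOAL else 0.0
        # Standard Q-learning update (learning rate 0.5, discount 0.9).
        q[(state, action)] += 0.5 * (
            reward + 0.9 * max(q[(nxt, a)] for a in ACTIONS)
            - q[(state, action)]
        )
        state = nxt

# The learned strategy: at every non-goal cell, move right (+1).
policy = {s: max(ACTIONS, key=lambda a: q[(s, a)]) for s in range(STATES)}
print(policy)
```

Because the strategy is recorded in the Q-table along with the conditions (states) under which it was learned, it can later be reported back, which is the explainability property the article describes.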

"People need to understand an autonomous system's skills and limitations to trust it with critical decisions," added Heintz.

Raytheon also reported that when the system has developed these skills, the team will apply it to a simulated search and rescue mission. Users will create the conditions of the mission, and meanwhile the system will make recommendations and give users information about its capability under the specific conditions. For example, the system might say, "In the rain, at night, I can distinguish between a person and an inanimate object with 90 percent accuracy, and I have done this over 1,000 times."
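A competency statement like the one quoted can be produced by having the system tally its own track record per condition. The sketch below is hypothetical; the condition labels and numbers are invented for illustration.

```python
from collections import defaultdict

# Per-condition performance ledger: condition -> [correct, total].
history = defaultdict(lambda: [0, 0])

def record(condition, correct):
    """Log one trial outcome under a given set of conditions."""
    history[condition][0] += int(correct)
    history[condition][1] += 1

def competency(condition):
    """State past accuracy under the condition, in plain language."""
    correct, total = history[condition]
    pct = 100 * correct / total
    return (f"Under '{condition}' conditions, I was correct "
            f"{pct:.0f}% of the time over {total:,} trials.")

# Simulate a track record: right 9 times out of every 10.
for i in range(1000):
    record("rain, night", i % 10 != 0)

print(competency("rain, night"))
```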

Postdoc Research Assoc./Senior Research Assoc. – Image Processing/Computer Vision & Machine Learning job with LANCASTER UNIVERSITY | 192791

Postdoctoral Research Associate/ Senior Research Associate in Image Processing/ Computer Vision and Machine Learning

In search of uniqueness: harnessing anatomical hand variation (H-unique)

School of Computing and Communications

Salary: £28,331 to £40,322

Closing Date: 29 February 2020

Interview Date: Mid-March 2020

Contract: Fixed Term

H-unique is a five-year, €2.5M ERC-funded programme of research, led by Lancaster University. This will be the first multimodal automated interrogation of visible hand anatomy, through analysis and interpretation of human variation. This exciting research opportunity has arisen directly from the ground-breaking research undertaken by Prof Dame Sue Black in relation to the forensic identification of individuals from images of their anatomy captured in criminal cases.

Assessment of the evidential robustness of hand identification for prosecutorial purposes requires the degree of uniqueness in the human hand to be assessed through large volume image analysis. The research opens up the opportunity to develop new and exciting biometric capabilities with a wide range of real-world applications, from security access to border control whilst assisting the investigation of serious and organised crime on a global level.

This is an interdisciplinary project, supported by anatomists, anthropologists, geneticists, bioinformaticians, image analysts and computer scientists. We are investigating inherent and acquired variation in search of uniqueness, as the hand retains and displays a multiplicity of anatomical variants formed by different aetiologies (genetics, development, environment, accident, etc.).

The primary aim of the H-Unique project is the successful analysis and interpretation of anatomical variation in images of the human hand. This will be achieved by developing new image processing/computer vision methods to extract key features from human hand images (e.g. vein pattern, skin knuckle creases, tattoos, pigmentation pattern) in a way that is robust to changes in viewpoint, illumination, background, etc. The project will be successful if no two hands can be found to be identical, implying uniqueness. Large datasets are vital for this work to be legally admissible. Through citizen engagement with science, this research will collect images from over 5,000 participants, creating an active ground-truth dataset. It will examine and address the effects of variable image conditions on data extraction and will design algorithms that permit auto-pattern searching across large numbers of stored images of variable quality. This will provide a major novel breakthrough in the study of anatomical variation, with wide ranging, interdisciplinary and transdisciplinary impact.

We are seeking to appoint a Postdoctoral/Senior Research Associate to work on feature extraction and biometric development. Hard biometrics, such as fingerprints, are well understood, and some soft biometrics are gaining traction within both biometric and forensic domains (e.g. superficial vein pattern, skin crease pattern, morphometry, scars, tattoos and pigmentation pattern). A combinatorial approach of soft and hard biometrics has not previously been attempted from images of the hand. We will pioneer the development of new methods that will release the full extent of variation locked within the visible anatomy of the human hand and reconstruct its discriminatory profile as a retro-engineered multimodal biometric. A significant step change is required in the science to both reliably and repeatably extract and compare anatomical information from large numbers of images, especially when the hand is not in a standard position or when either the resolution or lighting in the image is not ideal.
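One common way to combine several biometric modalities into a single decision is score-level fusion: each modality produces a match score, and the scores are combined with a weighted sum. The sketch below is a minimal illustration of that general idea only; the function, the modality names, and the weights are illustrative assumptions, not taken from the H-Unique project.

```python
def fuse_scores(modality_scores, weights):
    """Weighted score-level fusion of per-modality match scores.

    modality_scores: dict mapping modality name -> similarity in [0, 1]
    weights: dict mapping modality name -> non-negative weight
    Returns a fused similarity in [0, 1].
    """
    total_weight = sum(weights[m] for m in modality_scores)
    if total_weight == 0:
        raise ValueError("weights must not all be zero")
    return sum(modality_scores[m] * weights[m] for m in modality_scores) / total_weight

# Illustrative match scores for one candidate hand image (made-up values).
scores = {"vein_pattern": 0.91, "knuckle_creases": 0.78, "pigmentation": 0.64}
weights = {"vein_pattern": 0.5, "knuckle_creases": 0.3, "pigmentation": 0.2}

fused = fuse_scores(scores, weights)
```

In practice the weights would be learned from data, and the per-modality scores would come from the robust feature extractors the post is intended to develop.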

We invite applications from enthusiastic individuals who have a PhD or equivalent experience in a relevant discipline such as Computer Science or Electrical Engineering. You must be able to demonstrate a research background in the area of image processing, computer vision, and/or deep learning. Familiarity with image analysis methods, biometrics and machine learning/deep learning frameworks is not essential but will be an advantage. We will also place a high value on your ability to learn rapidly and to adapt to new technologies beyond your current skills and expertise. For more details, please see the Job Description/Person Specification for this position.

This Postdoctoral/Senior Research Associate position is being offered on a 2-year fixed-term basis. For further information or an informal discussion please contact Dr Bryan Williams (b.williams6@lancaster.ac.uk), Prof Plamen Angelov (Email: p.angelov@lancaster.ac.uk) or Dr Hossein Rahmani (Email: h.rahmani@lancaster.ac.uk).

The School of Computing and Communications offers a highly inclusive and stimulating environment for career development, and you will be exposed to a range of further opportunities over the course of this post. We are committed to family-friendly and flexible working policies on an individual basis, as well as the Athena SWAN Charter, which recognises and celebrates good employment practice undertaken to address gender equality in higher education and research.

Lancaster University - ensuring equality of opportunity and celebrating diversity


Manufacturing 2020: 5G, AI, IoT And Cloud-Based Systems Will Take Over – DesignNews

Technology vendors expect that 2020 will be a big year for manufacturing plants to onboard digital systems. But will it happen? While digital systems (IoT, machine learning, 5G, cloud-based systems) have proven themselves worthwhile investments, they may not get deployed widely.

For insight on what to expect in 2020, we turned to Rajeev Gollarahalli, chief business officer at 42Q, a cloud-based MES software division of Sanmina. Gollarahalli sees a manufacturing world that will take solid steps toward digitalization in 2020, but those steps are likely to be incremental rather than revolutionary.

5G On The Plant Floor

Design News: Will 5G increase the pace of digital factory transformation, and where will it have the most impact?

Rajeev Gollarahalli: We've started to see a little 5G popping up in the factory, but it's limited. It's mostly still in the proof-of-concept stage. It will be some time before we see more, probably around the end of 2020.

DN: Will 5G increase the pace of digital transformation?

Gollarahalli: Undoubtedly. Yet one limit is that in order to make accurate decisions, you need to be able to ingest high volumes of data in real time, and that has been one of the limitations of the infrastructure. Once you can use 5G across the factory, you'll have that infrastructure, and the data challenge is solved.

DN: What still needs to be done in order to deploy 5G?

Gollarahalli: You have the 5G service providers and the 5G equipment manufacturers, but both are developing capabilities in their own silos. What has not yet matured is putting these together, whether it's in health, discrete manufacturing, telecom, or aerospace. The use cases haven't matured, but we are seeing more of them piling up.

DN: What could spur equipment vendors and telecom to work together?

Gollarahalli: I think we'll see an industry consortium. That doesn't exist now. There are partners that are starting to talk; Verizon is working with network providers. You're going to see two or three different groups emerge and come together to set standards. With the advent of 5G and the emergence of IIoT, they are all going to come together. One of the limitations is volume: we generate about a terabyte of data with IoT. The timing will be perfect for getting 5G utilized for IoT and getting it widely adopted.

The Emerging Workforce Skilled In Digital Systems

DN: What changes in the plant workforce can we expect in the coming year?

Gollarahalli: The workforce will need a completely different set of skills to drive automation on the factory floor, and industry has to learn how to attract those workers. People are saying manufacturing is contracting, but I'm not seeing it. Manufacturing seems to be stable. As for skills for the factory of the future, we need to be re-tooling our employees. The employees today don't have the technical skills, but they have the domain skills. We need to get them the technical skills they need.

DN: Will the move to a workforce with greater technology skills be disruptive?

Gollarahalli: You're not going to see mass layoffs, but you're going to see retooling of the employees' skills. We can't get them trained at the speed that technology is advancing. We're going to see more employees getting ready in trade schools and with degrees. What you're seeing is a convergence of data and AI skills with domain skills. The ideal skillset is someone who understands manufacturing and knows the data. For several years kids were moving away from STEM, wanting to learn sexier stuff. But I think STEM is coming back.

Cloud-Based Systems For Security

DN: Will cloud-based systems be the go-to for manufacturing security versus on-premises security?

Gollarahalli: Five years ago, when I talked about cloud with customers, they asked whether it was real time. That was when the infrastructure was not as secure. I have a network at home; that was unheard of 10 years ago in factories. Now that the infrastructure issue has been solved, the next step is security. I have always countered that you can't secure data on premises as well as you can in the cloud. A lot of money has poured into cloud-based security, and no single company can match that. It's almost impossible to do it on premises.

AI, Machine Learning and Big Data Analytics

DN: Will we see advances in AI, machine learning, and analytics?

Gollarahalli: We're seeing AI and ML (machine learning) in some areas, and we're seeing it implemented in some areas at 42Q. Most use cases are around asset management and quality. It's used to predict the quality of a product and to take preventive actions in asset maintenance. AI and ML are also popping up in supply chain management. 2020 will be the year of AI and ML. It's getting embedded into medical products. You'll see it pop up everywhere, showing up on the factory floor as well as in our consumer products.
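The asset-maintenance use case described above often starts with something far simpler than deep learning: flagging sensor readings that deviate from a machine's historical baseline so preventive action can be taken. The sketch below is a crude, illustrative stand-in for that idea; the readings, threshold, and function name are made up, not 42Q's method.

```python
import statistics

def flag_anomaly(history, reading, z_threshold=3.0):
    """Flag a new sensor reading as anomalous if it deviates from the
    historical baseline by more than z_threshold standard deviations.
    A crude stand-in for the ML-driven asset-maintenance checks described above."""
    mean = statistics.fmean(history)
    stdev = statistics.stdev(history)
    z = abs(reading - mean) / stdev
    return z > z_threshold

# Made-up vibration readings from a healthy machine.
baseline = [0.42, 0.40, 0.43, 0.41, 0.39, 0.44, 0.40, 0.42]

flag_anomaly(baseline, 0.90)  # a reading far outside the baseline is flagged
flag_anomaly(baseline, 0.43)  # a reading inside normal variation is not
```

A production system would replace the z-score rule with a trained model, but the closed feedback from sensor to decision is the same shape.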

DN: Are AI and machine learning going mainstream yet, or are they mostly getting deployed by large manufacturers, who are typically the early adopters?

Gollarahalli: You're going to see it move down the supply chain to tier 2 and tier 3 suppliers. I don't think it's just for the elite any more. It's getting adopted quickly, but not as quickly as I thought it would be.

The Role Of IoT In Manufacturing

DN: Will we see growth in IoT's role in measuring and providing closed-loop controls?

Gollarahalli: We're going to see it in manufacturing, regulating the humidity in the room or the temperature on the floor. They need closed loop from IoT. They're measuring with IoT, but the closed loop has not been adopted as quickly. We don't have the right standards. How do you do closed loop with a system that is throwing off data in milliseconds? You must be able to use the IoT and those algorithms. If you can make them more efficient for closed-loop control, you'll see a lot more of it going forward.
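The humidity example above is the classic shape of closed-loop control: the controller reads a sensor, compares it to a setpoint, and actuates. A minimal sketch is bang-bang control with a hysteresis band, which avoids rapid on/off cycling when noisy IoT readings hover near the setpoint. The setpoint, band, and readings below are illustrative assumptions.

```python
def hysteresis_control(humidity, dehumidifier_on, setpoint=45.0, band=2.0):
    """Simple closed-loop (bang-bang) control with hysteresis:
    turn the dehumidifier on above setpoint + band, off below setpoint - band,
    and otherwise keep the current state to avoid rapid cycling on noisy data."""
    if humidity > setpoint + band:
        return True
    if humidity < setpoint - band:
        return False
    return dehumidifier_on

# Stream of made-up humidity readings (%RH) arriving from an IoT sensor.
state = False
states = []
for reading in [44.0, 48.5, 46.0, 42.5, 44.5]:
    state = hysteresis_control(reading, state)
    states.append(state)
```

Readings inside the 43-47 band leave the state unchanged, so the actuator only switches on a clear excursion; higher-rate data would call for PID or model-based control rather than this on/off rule.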

Rob Spiegel has covered automation and control for 19 years, 17 of them for Design News. Other topics he has covered include supply chain technology, alternative energy, and cyber security. For 10 years, he was owner and publisher of the food magazine Chile Pepper.



Doctor’s Hospital focused on incorporation of AI and machine learning – EyeWitness News

NASSAU, BAHAMAS - Doctors Hospital has deprioritized its medical tourism program and is now more keenly focused on incorporating artificial intelligence and machine learning in healthcare services.

Dr Charles Diggiss, Doctors Hospital Health System president, revealed the shift during a press conference to promote the 2020 Bahamas Business Outlook conference at Baha Mar next Thursday.

"When you look at what's happening around us globally with the advances in technology, it's no surprise that the way companies leverage data becomes a game changer if they are able to leverage the data using artificial intelligence or machine learning," Diggiss said.

"In healthcare, what makes it tremendously exciting for us is we are able to sensorize all of the devices in the healthcare space, get much more information, and use that information to tell us a lot more about what we should be doing and considering in your diagnosis."

He continued: "How can we get information in real time that would influence the way we manage your conditions? How can we have on the back end the assimilation of this information so that the best outcome occurs in our patient care environment?"

Diggiss noted that while the BISX-listed healthcare provider is still involved in medical tourism, it is no longer a primary focus.

"We still have a business line of medical tourism, but one of the things we do pretty quickly at Doctors Hospital is deprioritize if it's apparent that that is not a successful way to go," he said.

"We have looked more at taking our specialities up a notch and investing in the technology support of the specialities, with the leadership of some significant Bahamian specialists abroad, inviting them to come back home."

He added: "We have deprioritized medical tourism even though we still have a fairly robust programme going on at our Blake Road facility featuring two lines, a stem cell line and a fecal microbiota line."

"They are both doing quite well, but we are not putting a lot of effort into that right now compared to the aforementioned."
