Eight Los Alamos National Laboratory Projects Win R&D 100 Awards: Oscars Of Invention Honor Best Innovations Of Past Year – Los Alamos Reporter

Los Alamos National Laboratory scientists brought home eight of the coveted R&D 100 awards this year, plus an additional four special-recognition awards. Pictured above, scientist Ramesh Jha was among the winners, with the Smart Microbial Cell Technology project, an ultra-high-throughput screening platform to engineer custom biocatalysts. Photo Courtesy LANL


Los Alamos National Laboratory technologies brought in eight R&D 100 Awards and Special Recognition Awards, including a Gold Award for Corporate Social Responsibility, Gold and Silver Awards for Market Disruptor Services, and a Bronze Award for Green Technology, presented by R&D World magazine.

"Congratulations to the brilliant teams behind Los Alamos National Laboratory's 8 R&D 100 awards," said Laboratory Director Thom Mason. "These awards reflect the great work that our Laboratory does that can both benefit humanity and advance the frontiers of science."

"I congratulate the R&D 100 Award teams. These diverse innovations exemplify the Laboratory's tradition of scientific and engineering excellence in support of our national security mission, industrial competitiveness, and the broader technical community," said John Sarrao, Deputy Laboratory Director for Science, Technology, and Engineering. "The awards demonstrate the strength of our partnerships with industry, academia, and other national laboratories in developing innovative solutions to challenging problems."

The R&D 100 Awards, the prestigious "Oscars of Invention," honor the latest and best innovations and identify the top technology products of the past year. The awards span industry, academia and government-sponsored research organizations.

Since 1978, Los Alamos has won more than 170 R&D 100 Awards. The Laboratory's discoveries, developments, advancements and inventions make the world a better and safer place, bolster national security and enhance national competitiveness.

See all of the 2020 R&D 100 Awards. Read more about the Laboratory's past R&D 100 Awards.

Background about the winners

Amanzi-ATS: The open-source software models complex environmental systems across multiple scales. The innovation includes the most complete suite of surface/subsurface processes, built on a flexible infrastructure that allows users to select physical processes and their coupling interactions without rewriting software. Amanzi-ATS has been used to analyze pristine local watersheds, wildfire impact on watersheds, subsurface contaminant transport at legacy waste sites, the effect of a warming climate on the Arctic tundra, and groundwater in fractured porous media.

Los Alamos led the joint entry with Oak Ridge National Laboratory, Lawrence Berkeley National Laboratory, and Pacific Northwest National Laboratory. David Moulton directed the Los Alamos team of Adam Atchley, Charles Abolt, Joe Beisman, Katrina Bennett, Markus Berndt, Quan Bui, Michael Buksas, Neil Carlson, Dylan Harp, Rao Garimella, Vitaliy Gyrya, Eugene Kikinzon, Konstantin Lipnikov, Julien Loiseau, Daniel Livingston, Zhiming Lu, Terry Miller, John Ortiz, Alexis Perry, Lori Pritchett-Sheats, Daniil Svyatsky, Alec Thomas, and Svetlana Tokareva, plus researchers from the other collaborating national laboratories.

Cluster Integrity, Exception Resolution, and Reclustering Algorithm (CIERRA): CIERRA is the first and only software that assesses massive amounts of data from real-time satellite lightning observation and reprocesses that data to ensure an accurate, comprehensive interpretation of the lightning flashes. This enables observation and analysis of previously unstudied, complex cases of extreme and exceptional lightning and provides situational awareness of lightning hazards.

Michael Peterson teamed with Scott Rudlosky (University of Maryland). CIERRA also received a Gold Special Recognition Award for Market Disruptor-Services, which highlights any service from any category as one that forever changed the R&D industry or a particular vertical within the industry.

Legion: A data-centric programming system. Legion is a supercomputing programming system that boosts application performance and speed by automating task scheduling and data movement. The open-source software employs a novel programming language, Regent, which is compatible with all supercomputing architectures. Legion reduces the effort required to write supercomputing applications, eliminating the bottleneck to next-generation exascale computing and enabling the highest levels of performance. Legion is now being applied in computational science, machine learning, and data-centric computing.

Los Alamos led the joint entry with Stanford University, NVIDIA, Sandia National Laboratories, University of California-Davis, and SLAC National Accelerator Laboratory. Galen Shipman and Patrick McCormick directed the Los Alamos research team of Jonathan Graham, Irina Demeshko, Nirmal Prajapati, and Wei Wu.

Multi-Burn Solid Rocket: Solid rockets are high thrust, safe, scalable, and can be stored for long periods. However, they traditionally provide only a single burn per motor. The Multi-burn Solid Rocket is a revolutionary system providing multiple independent thrusts from a single solid rocket. This new capability could provide agile maneuverability for even the smallest and lowest-cost satellites. The Earth's orbital zones are an important natural resource. The Multi-burn Solid Rocket could help protect this resource by enabling satellites to avoid orbital debris and to de-orbit at the end of life.

Nicholas Dallmann, Bryce Tappan, and Mahlon Wilson led the team of Eva Baca, Malakai Coblentz, Kavitha Chintam, Bo Folks, Dave Hemsing, Mitchell Hoffmann, Lee Holguin, Joseph Lichthardt, Alan Novak, Kassidy Shedd, Ian Shelburne, Jacob Valdez.

OrganiCam: OrganiCam opens exciting frontiers in space exploration and the search for signs of life beyond the Earth. The compact laser-induced fluorescence imaging camera with Raman spectrometer could identify organic molecules and biosignatures in Martian caves, on icy moons, and on asteroid surfaces. OrganiCam's robust design for extreme environments, portability, simple operation, and low power requirement build on the Lab's 50+ years designing robotic instruments for space applications. Los Alamos led the joint entry with the University of Hawaii. Roger Wiens and Patrick Gasda directed the Los Alamos team of Samuel Clegg, Magdalena Dale, Kumkum Ganguly, Steven Love, Anthony Nelson, Adriana Reyes-Newell, Raymond Newell, Logan Ott, Benigno Sandoval, and Heather Quinn. Anupam Misra led a team of collaborators from the University of Hawaii.

QUIC-Fire: The software is the first fast-running, laptop-capable, 3D fire-atmosphere feedback model for complex wildfire and prescribed fire scenarios. It simulates critical influences of 3D vegetation structure, variable winds, interaction between multiple fires, and complex topography at meter-scale spatial resolutions. QUIC-Fire transforms fire and fuel managers' ability to assess risk, optimize fuel treatments, and plan prescribed burns to prevent catastrophic wildfires.

Rodman Linn of Los Alamos, Scott Goodrick of the USDA Forest Service, and J. Kevin Hiers of Tall Timbers Research Station led the team of Los Alamos researchers Sara Brambilla, Michael Brown, Alexandra Jonko, Alexander Josephson, Richard Middleton, and David Robinson plus collaborators from the USDA Forest Service.

In addition to the R&D 100 Award, QUIC-Fire won a Gold Medal in the Corporate Social Responsibility category. This award honors organizational efforts to be a greater corporate member of society, from a local to global level.

Smart Microbial Cell Technology: Biocatalysts are essential for food production, pharmaceuticals, specialty chemicals, renewable energy, and environmental cleanup. Current methods to find biocatalysts are slow. Smart Microbial Cell Technology scans genetic variations to optimize a single enzyme or microbial cell to generate a product efficiently. It selects rare mutations needed for biocatalyst optimization orders of magnitude faster than current screening methods. A custom sensor-reporter gene circuit causes cells to fluoresce when they are making the target product. When coupled to flow cytometry, a million biocatalyst variants can be screened in hours. Ramesh Jha led the team of Taraka Dale and Charlie Strauss.

Smart Microbial Cell Technology also received a Silver Special Recognition Award for Market Disruptor-Services, which highlights any service from any category as one that forever changed the R&D industry or a particular vertical within the industry.

Spectroscopic Detection of Nerve Agents (SEDONA): Current airport detection systems cannot scan for the threat of toxic organophosphorus nerve agents and insecticides. The SEDONA portable system screens for these threats through an unopened bottle using the principles of nuclear magnetic resonance spectroscopy. The system could be deployed and operated with minimal training. SEDONA dramatically reduces the likelihood of a successful nerve agent attack at airports, government buildings, embassies, sporting events, concerts, and political rallies.

Robert F. Williams led the team of Michelle Espy, Jacob Yoder, Derrick Kaseman, Per Magnelind, Algis Urbaitis, Michael Janicke, Ryszard Michalczyk, Jurgen Schmidt, Pulak Nath, and Scarlett Widgeon Paisner.

Additional Recognition: The Laboratory also received a Bronze Medal for Green Technology. The awards recognize innovations that help make our environment greener and bring the goal of energy reduction closer.

Oleo-Furan Surfactants Made from Renewable Biomass: Oleo-furan surfactants (OFS) are a new class of non-toxic, non-irritating cleaning agents (surfactants) for laundry detergent. OFS is the only surfactant that performs effectively in cold water and in hard water, without the additional chemicals that other detergents must use to bind the minerals in hard water. OFS can be produced readily from sustainable, bio-derived molecules that don't tax the environment or compete with the food supply chain.

Andrew Sutton of Los Alamos National Laboratory led the Los Alamos team of Cameron Moore and Xiao (Claire) Yang. Christoph Krumm of Sironix Renewables led collaborators from Sironix. Separately, one Laboratory team also received an R&D 100 Finalist Award:

Electrochemical Hydrogen Contamination Detector: The detector protects zero-emission fuel cells by detecting the presence of impurities in hydrogen fuel. Fuel cells power environmentally clean vehicles, forklifts, and drones, and provide auxiliary power. High-purity hydrogen is essential to avoid poisoning fuel cells. Current analysis methods are costly and cannot detect impurities in real time. The Hydrogen Contamination Detector measures some of the highest-impact Department of Energy-identified impurities with a simple, low-cost unit that provides 24/7, point-of-service analysis.

Eric Brosha led the team of Christopher Romero, Mahlon Wilson, Rangachary Mukundan, Cortney Kreller, and Tommy Rockward, plus collaborators from H2 Frontier, Inc. and Skyre, Inc.


About Los Alamos National Laboratory

Los Alamos National Laboratory, a multidisciplinary research institution engaged in strategic science on behalf of national security, is operated by Triad, a public-service-oriented national security science organization equally owned by its three founding members: Battelle Memorial Institute (Battelle), The Texas A&M University System (TAMUS), and The Regents of the University of California (UC), for the Department of Energy's National Nuclear Security Administration.

Los Alamos enhances national security by ensuring the safety and reliability of the U.S. nuclear stockpile, developing technologies to reduce threats from weapons of mass destruction, and solving problems related to energy, environment, infrastructure, health, and global security concerns.




The 13 Best Machine Learning Courses and Online Training for 2020 – Solutions Review

The editors at Solutions Review have compiled this list of the best machine learning courses and online training to consider for 2020.

Machine learning involves studying computer algorithms that improve automatically through experience. It is a sub-field of artificial intelligence in which machine learning algorithms build models based on sample (or training) data. Once a predictive model is constructed, it can be used to make predictions or decisions without being specifically commanded to do so. Machine learning is now a mainstream technology with a wide variety of uses and applications. It is especially prevalent in the fields of business intelligence and data management.
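To make the train-then-predict loop described above concrete, here is a minimal sketch using Python and the open-source scikit-learn library. The tiny dataset is invented purely for illustration.

```python
# A minimal sketch of the machine learning workflow described above:
# fit a model on sample (training) data, then use it to make predictions.
# The data below is invented purely for illustration.
from sklearn.tree import DecisionTreeClassifier

# Training data: [hours_studied, classes_attended] -> passed (1) or failed (0)
X_train = [[2, 4], [10, 18], [1, 2], [8, 15], [4, 9], [12, 20]]
y_train = [0, 1, 0, 1, 0, 1]

model = DecisionTreeClassifier()
model.fit(X_train, y_train)  # the model "learns" from the sample data

# The fitted model can now make predictions on inputs it has never seen,
# without being explicitly programmed with a pass/fail rule.
print(model.predict([[9, 16], [2, 3]]))  # e.g. [1 0]
```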

With this in mind, we've compiled this list of the best machine learning courses and online training to consider if you're looking to grow your AI or data science skills for work or play. This is not an exhaustive list, but one that features the best machine learning courses and training from trusted online platforms. We made sure to mention and link to related courses on each platform that may be worth exploring as well. Click "Go to training" to learn more and register.

Platform: Coursera

Description: This course provides a broad introduction to machine learning, data mining, and statistical pattern recognition. Topics include: (i) Supervised learning (parametric/non-parametric algorithms, support vector machines, kernels, neural networks). (ii) Unsupervised learning (clustering, dimensionality reduction, recommender systems, deep learning). (iii) Best practices in machine learning (bias/variance theory; innovation process in machine learning and AI).

Related paths/tracks: Machine Learning with Python (IBM), Machine Learning Specialization (University of Washington), Mathematics for Machine Learning Specialization (Imperial College London), Machine Learning with TensorFlow on Google Cloud Platform Specialization (Google Cloud)

Platform: DataCamp

Description: In this non-technical course, you'll learn everything you've been too afraid to ask about machine learning. There's no coding required. Hands-on exercises will help you get past the jargon and learn how this exciting technology powers everything from self-driving cars to your personal Amazon shopping suggestions. How does machine learning work, when can you use it, and what is the difference between AI and machine learning? They're all covered.

Related paths/tracks: Machine Learning for Business, Machine Learning with Tree-Based Models in Python, Machine Learning with caret in R

Platform: Edureka

Description: Edureka's Machine Learning Certification Training using Python helps you gain expertise in various machine learning algorithms such as regression, clustering, decision trees, random forest, Naïve Bayes and Q-Learning. This training exposes you to concepts of statistics, time series and different classes of machine learning algorithms like supervised, unsupervised, and reinforcement algorithms. Throughout the course, you'll be solving real-life case studies on media, healthcare, social media, aviation, and HR.

Related paths/tracks: Graphical Models Certification Training, Reinforcement Learning, Natural Language Processing with Python

Platform: edX

Description: Perhaps the most popular data science methodologies come from machine learning. What distinguishes machine learning from other computer-guided decision processes is that it builds prediction algorithms using data. Some of the most popular products that use machine learning include the handwriting readers implemented by the postal service, speech recognition, movie recommendation systems, and spam detectors.

Related paths/tracks: Machine Learning for Data Science and Analytics (Columbia), Machine Learning Fundamentals (UC San Diego), Machine Learning with Python: from Linear Models to Deep Learning

Platform: Experfy

Description: As an introduction to machine learning, this course is presented at a level that is readily understood by all individuals interested in machine learning. This course provides a history of machine learning, defines data, and explains what is meant by big data; classifies data in terms of computer programming; covers the basic concept of numeral systems and the common numeral systems used by computer hardware to establish programming languages; and provides practical applications of machine learning.

Related paths/tracks: Machine Learning for Predictive Analytics, Feature Engineering for Machine Learning, Supervised Learning: Classification, Supervised Learning: Linear Regression, Unsupervised Learning: Clustering

Platform: Intellipaat

Description: This machine learning course will help you master the skills required to become an expert in this domain. Master skills such as Python, ML algorithms, statistics, supervised and unsupervised learning, etc. to become a successful professional in this popular technology. Intellipaat's machine learning certification training comes with 24/7 support, multiple assignments, and project work to help you gain real-world exposure.

Related path/track: Artificial Intelligence Course and Training

Platform: LinkedIn Learning

Description: In this course, we review the definition and types of machine learning: supervised, unsupervised, and reinforcement. Then you can see how to use popular algorithms such as decision trees, clustering, and regression analysis to see patterns in your massive data sets. Finally, you can learn about some of the pitfalls when starting out with machine learning.

Related paths/tracks: Essential Math for Machine Learning: Python Edition, Applied Machine Learning: Algorithms, Applied Machine Learning Foundations

Platform: Mindmajix

Description: Mindmajix Machine Learning Training will help you develop the skills and knowledge required for a career as a Machine Learning Engineer. You will gain in-depth knowledge of all the concepts of machine learning, including supervised and unsupervised learning, algorithms, support vector machines, etc., through real-time industry use cases, and this will help you in clearing the Machine Learning Certification Exam.

Related path/track: Machine Learning with Python Training

Platform: Pluralsight

Description: Have you ever wondered what machine learning is? That's what this course is designed to teach you. You'll explore the open-source programming language R, learn about training and testing a model as well as using a model. By the time you're done, you'll have a clear understanding of exactly what machine learning is all about.

Related paths/tracks: Understanding Machine Learning with Python, Understanding Machine Learning with R, Machine Learning: Executive Briefing, How Machine Learning Works, Deploying Machine Learning Solutions

Platform: Simplilearn

Description: This machine learning online course offers an in-depth overview of machine learning topics including working with real-time data, developing algorithms using supervised and unsupervised learning, regression, classification, and time-series modeling. Learn how to use Python in this machine learning certification training to draw predictions from data.

Platform: Skillshare

Description: If you've got some programming or scripting experience, this course will teach you the techniques used by real data scientists in the tech industry and prepare you for a move into this hot career path. This comprehensive course includes 68 lectures spanning almost 9 hours of video, and most topics include hands-on Python code examples you can use for reference and for practice.

Related paths/tracks: Demystifying Artificial Intelligence: Understanding Machine Learning, Goal-Driven Artificial Intelligence and Machine Learning

Platform: Udacity

Description: Learn advanced machine learning techniques and algorithms and how to package and deploy your models to a production environment. Gain practical experience using Amazon SageMaker to deploy trained models to a web application and evaluate the performance of your models. A/B test models and learn how to update the models as you gather more data, an important skill in the industry.

Related paths/tracks: Intro to Machine Learning with PyTorch, Intro to Machine Learning with TensorFlow

Platform: Udemy

Description: This course has been designed by two professional data scientists who can share their knowledge and help you learn complex theory, algorithms, and coding libraries in a simple way. The course will walk you step-by-step into the world of machine learning. With every tutorial, you will develop new skills and improve your understanding of this challenging yet lucrative sub-field of data science.

Related paths/tracks: Python for Data Science and Machine Learning Bootcamp, Machine Learning, Data Science and Deep Learning with Python, Data Science and Machine Learning Bootcamp with R

Timothy is Solutions Review's Senior Editor. He is a recognized thought leader and influencer in enterprise BI and data analytics. Timothy has been named a top global business journalist by Richtopia. Scoop? First initial, last name at solutionsreview dot com.


Nextcloud 20: One private cloud to rule them all – ZDNet

I've long recommended Nextcloud as a wonderful open-source, private Infrastructure-as-a-Service (IaaS) cloud. I run it myself, both on a server in my office and on my TMDHosting remote server. Over time, though, Nextcloud has been adding more features, including built-in video-conferencing and group meeting services via Nextcloud Talk and a Software-as-a-Service (SaaS) version of the LibreOffice office suite, Collabora. Now, with Nextcloud 20, other third-party services such as Microsoft Teams, Slack, Jira, GitHub, Twitter, and dozens of others are being integrated.

As Frank Karlitschek, Nextcloud's CEO and founder, explained, "Nextcloud Hub 20 is a dramatic step for users, bringing the different platforms they use during the day into an integrated experience. This can reduce friction, improve reaction times, avoid context switching and ultimately bring greater productivity to our tens of millions of users across the globe -- all the while protecting data security and digital sovereignty of private and enterprise users."

It adds on to its existing strengths by using an open application programming interface (API), the Open Collaboration Services (OCS). This is a long-established API: started as part of KDE's open desktop standardization effort in 2009, OCS now handles basic file functionality like file access, sharing, versioning, and commenting. It also covers communications, calendaring, and task management.
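To give a flavour of the file-sharing side of OCS, the sketch below creates a public share link on a Nextcloud server. The endpoint path and the OCS-APIRequest header follow Nextcloud's published files_sharing API, but the server URL, credentials and file path are placeholders, and the details should be checked against the official OCS documentation.

```python
# Sketch: creating a public link share via Nextcloud's OCS
# files_sharing API. Server URL and credentials are placeholders.
import requests

NEXTCLOUD = "https://cloud.example.com"  # placeholder server
AUTH = ("alice", "app-password")         # placeholder credentials

resp = requests.post(
    f"{NEXTCLOUD}/ocs/v2.php/apps/files_sharing/api/v1/shares",
    auth=AUTH,
    headers={"OCS-APIRequest": "true"},  # required on all OCS calls
    data={
        "path": "/Documents/report.odt",  # file to share
        "shareType": 3,                   # 3 = public link share
    },
)
resp.raise_for_status()
print(resp.text)  # response contains the new share's URL
```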

Nextcloud is using it to bridge the gap between its own Talk chat service with other communications services, besides the ones mentioned earlier, such as Matrix, IRC, XMPP, and many others. More IM services, such as HipChat and Telegram, are in the works.

It's also using it to glue together Nextcloud's integrated, unified search with ticketing systems such as Jira and Zammad, development platforms like GitHub and Gitlab, e-learning systems like Moodle, forums like Discourse and Reddit, and social media platforms such as Twitter and Mastodon.

It also comes with a new user status. With this, you can now let your co-workers know when you're available or, if you like, what you're working on at any given moment.

You can also search across the entire Nextcloud platform, instead of being application-specific. Yes, that means you can search over all your files and programs' data. You can also extend the search to other apps. So far, Nextcloud's developers have extended it over the Github and Gitlab coding platforms, the Zammad and Jira helpdesk tools, Discourse forums, and the Moodle e-learning system.

Under the hood, Nextcloud Talk 10 brings performance and call reliability improvements. It also adds an emoji picker and upload view, better camera and microphone settings, mute abilities for moderators, and more.

Nextcloud Mail has also been improved. It now includes extensive mailbox management features such as threaded email views and other interface updates.

In the Nextcloud back-end, numerous performance and security improvements were made to make the Nextcloud interface more responsive. Its Lightweight Directory Access Protocol (LDAP) back-ends have also been tuned up.

Making the most of all these features still requires an expert administrator. Nextcloud is a do-it-yourself platform in many ways. For example, you'll need to manually activate many of the new functions before users can put them to work.

Still, the basic IaaS and SaaS functionality is already there and ready to run for anyone. The other features may take some time and effort, but in the end, you'll have an open-source private cloud workspace that you can use with popular public services and your own in-house storage and services.


Will the first VP debate tell us more about the 2020 election than Trump and Biden’s? – News@Northeastern

On Wednesday, Vice President Mike Pence and Democratic nominee Sen. Kamala Harris will face off in a debate that might be more substantial than the presidential matchup the week prior.

Where will the two vice presidential candidates differ in their policies? How will they interact with one another?

Michelle Borkin, an assistant professor in the Khoury College of Computer Sciences at Northeastern. Photo courtesy Michelle Borkin

Michelle Borkin, an assistant professor in the Khoury College of Computer Sciences at Northeastern University, and her doctoral student, Laura South, have a new way to answer these questions and distill the often unwieldy debates into coherent, easy-to-access information: an online tool called DebateVis.

The project was created by an interdisciplinary team of researchers at Northeastern following a two-year-long study of argument structure and text summarization, the process of creating short, accurate, and fluent summaries of longer text documents.

Rather than repeating soundbites and Twitter hot-takes, DebateVis downloads the entire debate and mines it for relevant information to help voters compare the candidates on a substantive level.

The tool's website displays three sections. In the middle, visitors can scroll through a full transcript of the debate. On the left, they can track the number of times each candidate mentions a topic or the name of an opponent. To the right, a timeline shows when the different topics came up for discussion. When readers click on a topic, such as climate change or education, the tool directs them to the relevant sections of the transcript. Audience reactions are also marked.

Borkin and South initially rolled out the website for the 2016 debates. They've relaunched the site for this year's debates ahead of the Nov. 3 election.

"With the elections coming up, we felt it was really important to educate the public and to have tools out there to make information accessible and clear," Borkin says.

Northeastern student Laura South studies computer science. Photo by Matthew Modoono/Northeastern University

During the first presidential debate on Sept. 29, the researchers paid close attention to topics and interactions. Later that evening, using a script written in Python, an open-source programming language, South created visualizations for DebateVis by picking out the topics from a transcript.
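The article doesn't publish the script itself, but a simple keyword-matching pass over a transcript, roughly in the spirit of what's described, might look like the sketch below. The topics, keywords and transcript lines are invented for illustration; this is not the DebateVis code.

```python
# Hypothetical sketch: tagging debate-transcript segments by topic.
# Topics and keywords are invented to illustrate the general approach.
from collections import Counter

TOPIC_KEYWORDS = {
    "climate change": {"climate", "emissions", "warming"},
    "economy": {"jobs", "taxes", "economy", "wages"},
    "health care": {"health", "insurance", "pandemic"},
}

def tag_topics(transcript):
    """Count topic mentions per speaker from (speaker, text) pairs."""
    counts = {speaker: Counter() for speaker, _ in transcript}
    for speaker, text in transcript:
        words = set(text.lower().split())
        for topic, keywords in TOPIC_KEYWORDS.items():
            if words & keywords:
                counts[speaker][topic] += 1
    return counts

transcript = [
    ("PENCE", "our economy created jobs and cut taxes"),
    ("HARRIS", "we must act on climate and cut emissions"),
]
print(tag_topics(transcript))
```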

Their findings confirmed what many viewers had witnessed: The much-anticipated face-off between the contenders was largely devoid of substance.

"One of the main takeaways I got from looking at our visualization of the debate is just how little debating there was [last] Tuesday," says Borkin. "Much of our algorithm lay quiet because so much of the evening did not include substantive conversations and arguments about policy, but rather petty conversations with no substance."

South says that for the casual, time-constrained viewer looking for recaps of debates, DebateVis offers a visual alternative to soundbites. And the visualizations may offer political junkies broader policy insights about the candidates.

"Someone who was able to watch the debate and is active and interested in keeping up with the political debates still might not be able to fully analyze the higher-level patterns that are occurring," she says. "It's hard to see recurring patterns when you're just sitting and watching it from start to finish."

After the election, the team of researchers (which includes Michail Schwab, Lu Wang, Nick Beauchamp, John Wihbey and Aleszu Bajak) plans to evaluate the kind of impact the website had and examine how much of a difference it made.

Creating an informed citizenry is the researchers' ultimate goal, says Borkin. The data visualizations are meant to equip citizens with the tools and knowledge they need to make informed decisions on issues and candidates.

"Initially it was 'how do we visualize the debate?' But once we got into thinking about this, it was 'we want an informed electorate,'" she says. "This is just our little tiny contribution to that."

South says more action needs to be taken to make data visualization accessible and useful for people who historically have been marginalized or kept out of positions of power.

"I view visualization as a tool for democratization in some ways," she says. "I think that it has the potential to give at least some power back to the people who maybe would be otherwise very distanced from this data that is about their lives and impacting their lives."

For media inquiries, please contact Shannon Nargi at s.nargi@northeastern.edu or 617-373-5718.


Lydia partners with Tink to improve open banking features – TechCrunch

French fintech startup Lydia is going to work with financial API startup Tink for its open banking features in its app. Lydia started as a peer-to-peer payment app and now has 4 million users in Europe.

Lydia's vision has evolved to become a financial super app that lets you control your bank accounts and access various financial services. In France, you can connect your Lydia account with your bank account using Budget Insight's Budgea API.

Over the coming weeks, Lydia is going to switch over and use Tink for some clients going forward. If you have a bank account that works better with Budget Insight, Lydia will keep using Budget Insight for those accounts. If Tink is a better option, Lydia will use Tink.

"It's going to be a progressive rollout and we'll use the best service depending on our users," Lydia co-founder and CEO Cyril Chiche told me.

Open banking is a broad concept and covers many different things. In Lydia's case, we're talking about two features in particular: account aggregation and payment initiation.

In the app, you can connect your bank accounts and view the most recent transactions. This feature is important if you want to become the go-to financial app on your users' home screens.

As for payment initiation, as the name suggests, it lets you start a SEPA bank transfer from a third-party service. For instance, you can transfer money from your bank account to your Lydia wallet directly in the Lydia app. You also can move money between multiple bank accounts from Lydia.

Tink provides a single API that manages all the complexities of the information systems of European banks. An API is a programming interface that lets two different services talk and interact with each other. Tink does the heavy lifting and translates each banking API into a predictable API that you can use for all banks.
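Purely to illustrate what one predictable interface buys a developer, here is a hypothetical sketch of account aggregation behind a unified API. The endpoint, token and field names are invented; they are not Tink's real API, only the general shape of the idea.

```python
# Hypothetical sketch of account aggregation through a unified open
# banking API. Endpoint, token and field names are invented and do not
# reflect Tink's actual API.
import requests

API_BASE = "https://api.example-aggregator.com"  # placeholder base URL
TOKEN = "user-scoped-access-token"               # placeholder token

def list_accounts():
    resp = requests.get(
        f"{API_BASE}/v1/accounts",
        headers={"Authorization": f"Bearer {TOKEN}"},
    )
    resp.raise_for_status()
    return resp.json()["accounts"]

for account in list_accounts():
    # The aggregator normalizes each bank's own format into common fields,
    # so the same loop works whatever bank sits behind the account.
    print(account["iban"], account["balance"], account["currency"])
```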

Since 2018, banks have had to provide some kind of API due to Europe's DSP2 regulation. It's been a slow start, as many French banks still don't provide a usable API. But it's slowly evolving.

Tinks API supports 15 financial institutions in France, including major banks, N26, Revolut and American Express. And it covers a dozen European markets, which is going to be important if Lydia wants to grab more users outside of its home country.

"At first, it's not going to add new things to the app. But it will allow us to provide features in a very stable environment and at a European scale," Chiche said.

"We want to have the most uniform product across different markets," he added later in the conversation.

When you first install Lydia and want to pay back a friend, you associate your debit card with your Lydia account. The startup charges your card before sending money to your friend.

If open banking APIs become the norm, you could imagine grabbing money from someone's bank account directly instead of paying card processing fees. But this sort of feature is nowhere near ready for prime time.

"What made us choose card payments is that it's a stable system with widespread usage and it works every time. When you're dealing with payments, it has to work every single time," Chiche said.

Lydia isn't changing anything on this front for now. But you could imagine some changes in a few years. "We are at the beginning of a new system that is not going to be ready within the next 18 months," Chiche said.

Cards also provide many advantages, such as the ability to charge back a payment. And card schemes have been trying new things, such as the ability to transfer money directly from one card to another. So you're not going to ditch your Mastercard or Visa card anytime soon. But Chiche thinks there will be some competition in Europe between DSP2-ready banks and card schemes. European consumers should see the benefits of increased competition.

In other news, Lydia usage dropped quite drastically during the full lockdown earlier this year. But transaction volume has bounced back since then and reached all-time highs. The company processes 250 million in transactions every month and it is currently adding 5,000 new users every day.


Arming smart factories with the right tools to keep threats at bay – The Manufacturer

Cybersecurity is a must for any smart factory setup: ignoring the issue could leave your company exposed to threats.

A common problem cybersecurity professionals have when talking about online threats is making the listener care. Things that happen in the cyber world (lines of code, complex machinery, virtual systems) often don't have the required impact because of their abstraction from everyday reality. The more technical the discussion gets, the more difficult it is to make it relevant.

However, when we talk about smart factories the risks are very real. Malicious code could shut down production lines, sabotage facilities and even put factory floor workers in physical danger. Tackling these challenges will require a new focus on co-operation between IT and OT teams.

Patching problems

In fact, it is IT-OT convergence in the modern manufacturing facility that has precipitated the cybersecurity crisis currently facing the industry. Before the Internet of Things (IoT) started to bring connectivity and computing power to operational technologies, manufacturing systems were arguably more secure. They ran on obscure proprietary software, which often made attacks too expensive to research and carry out, and, most importantly, they were not internet-connected, making them extremely difficult for remote attackers to reach.

That has largely changed as systems were modernised, with internet-connectivity everywhere and many systems running Windows: one of the most popular and most targeted operating systems in the world. Yet while the sector has changed, cybersecurity in many cases has not kept pace.

Take industrial robots: the backbone of the modern factory and the driving force behind Industry 4.0 digital transformation. They typically run on proprietary programming languages, some of which are now out of date and therefore can't be patched. Elsewhere, patches may not be applied because mission-critical systems can't be taken offline to test them, or because of non-security priorities. This is where OT and IT teams clash: OT's priority is usually uptime and safety, while IT will focus more on patching a known vulnerability to mitigate risk going forward.

Robots exposed

Many of the robotic machines on which smart factories rely have extensive access to resources on the shop floor, and it's often important that they do. File shares, industrial control systems (ICS), human machine interfaces (HMIs) and other elements also have a critical part to play in driving productivity and Industry 4.0 success. However, often there's no access control or isolation between these discrete parts of the manufacturing OT-IT environment. This matters, as it could enable attackers to compromise one part and then move laterally to another. Trend Micro's Zero Day Initiative saw a 16% year-on-year increase in the number of disclosed ICS vulnerabilities in the first half of 2020.

That's not all: sometimes the programming languages used in robots don't have any security validation checking. In fact, secure coding techniques are rarely used, as most of those writing the programmes themselves are process control engineers rather than developers. Code is frequently reused from public repositories rather than designed from scratch, which further exposes manufacturing organisations to cyber-risk. It's claimed that vulnerabilities in open source code libraries have doubled in nearly two years.

Vendor-backed app stores are another potential source of compromise. Until recently these haven't applied the same efforts as Apple and Google in validating applications to check if they are malicious. That means malware could find its way onto the platforms, hidden in legitimate-looking software. With this entry point into the smart factory, attackers could carry out a range of theoretical campaigns, from theft of sensitive IP and employee/customer data to sabotage of production systems, denial of service and ransomware. Some unauthorised commands may even cause robots to move in such a way as to endanger their operators.

A unified response

It's vital that IT and OT security teams work hard at breaking down the traditional silos that have kept them apart for so long. There needs to be a unified approach to securing these systems to ensure issues don't fall between the cracks until it's too late. Start by conducting an asset inventory and then map data flows and risk scores to it.
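As a toy illustration of that first step, the sketch below maps a small invented asset inventory to rough risk scores. The assets, factors and weights are made up; a real programme would score against an established risk framework.

```python
# Toy sketch: ranking an asset inventory by rough risk score.
# Assets, factors and weights are invented for illustration only.
assets = [
    {"name": "welding-robot-01", "internet": True,  "patched": False, "handles_data": False},
    {"name": "hmi-line-3",       "internet": True,  "patched": True,  "handles_data": True},
    {"name": "file-share",       "internet": False, "patched": True,  "handles_data": True},
]

def risk_score(asset):
    score = 0
    if asset["internet"]:      # reachable by remote attackers
        score += 3
    if not asset["patched"]:   # known, unmitigated vulnerabilities
        score += 3
    if asset["handles_data"]:  # value for theft or lateral movement
        score += 2
    return score

# Highest-risk assets first: a starting point for patching priorities.
for a in sorted(assets, key=risk_score, reverse=True):
    print(f"{a['name']}: risk {risk_score(a)}")
```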

Patching is essential for any internet-connected systems. For those on unsupported operating systems, or which can't be updated for other reasons, consider virtual patching to keep them protected from threats without the need to take systems offline for testing. Next come steps like network segmentation, isolating machines that process data from other networks to minimise the chances of information theft. It goes without saying that protection from a reputable provider, at the endpoint, network and server layers, is vital, as is implementation of a Secure Software Development Lifecycle (SSDLC).

As an industry which employs nearly three million people in the UK and accounts for hundreds of billions in exports, manufacturing is very much in the sights of the cybercrime community. As factories become increasingly packed with hi-tech components, it's critical that security doesn't remain an afterthought.


Snyk and PerimeterX Partner to Address Open Source JavaScript Risk Increasingly Common in Web Applications – GlobeNewswire

SAN MATEO, Calif., Oct. 06, 2020 (GLOBE NEWSWIRE) -- PerimeterX, the leading provider of application security solutions that keep digital businesses safe, and developer-first security company Snyk, today announced a technology alliance partnership that solves the growing number of open source vulnerabilities found in web applications, to help businesses minimize exposure to risk and data breaches.

Together, PerimeterX and Snyk provide a complete view of open source risk in web applications to reduce mean time to mitigate and improve collaboration between application security and development teams.

"By partnering with the industry-leading vendor for open source vulnerabilities, PerimeterX is ensuring that our customers have access to the most accurate and timely information to mitigate web application vulnerabilities in real time. The PerimeterX Code Defender runtime behavioral analysis and mitigation across first-, third- and Nth-party scripts, combined with comprehensive and actionable JavaScript vulnerability data from Snyk, provides users a quick path to remediation," said Ido Safruti, Co-founder and Chief Technology Officer, PerimeterX.

The need for efficiency and speed in developing web applications is driving increasing adoption of open source and containers. However, in attempting to expedite development by leveraging open source, code reuse and third-party scripts, enterprises face greater potential for risk. The Snyk 2020 State of Open Source Security Report found that the bulk of the open source vulnerabilities discovered are considered to be high severity. Furthermore, according to PerimeterX, as much as 70% of a typical website's code is third-party scripts.

PerimeterX Code Defender will integrate with the Snyk Intel Vulnerability Database to give application security teams a complete view of open source vulnerabilities in web applications, shortening mean time to mitigation and reducing the possibility of client-side data breaches and non-compliance.

"Snyk's new partnership with PerimeterX not only provides an automated, holistic view of vulnerabilities, but it also opens the door to quick, easy fixes and ongoing monitoring," said Geva Solomonovich, CTO of Global Alliances, Snyk. "The Snyk database includes the most current, comprehensive, actionable vulnerability data in the market. With developers able to make meaningful security decisions early in development, collaboration and efficiency between application security and development teams soars."

About Snyk Intel
Widely adopted because of its timely and accurate data, Snyk Intel combines automated machine learning with expert analysis maintained by a dedicated Snyk research team. In addition to PerimeterX, Red Hat, Docker, Google Chrome Lighthouse and the Linux Foundation embed Snyk Intel vulnerability data into their products to identify critical vulnerabilities in open source dependencies and container images.

About PerimeterX Code Defender
PerimeterX Code Defender is a client-side application security solution that continuously protects websites from digital skimming, formjacking and Magecart attacks, stopping data breaches and reducing the risk of non-compliance. It uses behavioral analysis and advanced machine learning to automatically detect vulnerable Shadow Code scripts, suspicious PII access and data leakage from users' browsers. With Code Defender, businesses can reduce the risk of data breaches and compliance penalties while improving operational efficiency.

About Snyk
Snyk is a developer-first security company that helps software-driven businesses develop fast and stay secure. Snyk is the only solution that seamlessly and proactively finds and fixes vulnerabilities and license violations in open source dependencies and container images. Snyk's solution is built on a comprehensive, proprietary vulnerability database, maintained by an expert security research team in Israel and London. With tight integration into existing developer workflows, source control (including GitHub, Bitbucket, GitLab), and CI/CD pipelines, Snyk enables efficient security workflows and reduces mean-time-to-fix. For more information or to get started with Snyk for free today, visit https://snyk.io.

About PerimeterX
PerimeterX is the leading provider of application security solutions that keep digital businesses safe. Delivered as a service, the company's Bot Defender, Code Defender and Page Defender solutions detect risks to your web applications and proactively manage them, freeing you to focus on growth and innovation. The world's largest and most reputable websites and mobile applications count on PerimeterX to safeguard their consumers' digital experience. PerimeterX is headquartered in San Mateo, California, and on the web at http://www.perimeterx.com.


GitHub: Now our built-in bug checker gets these third-party code-scanning tools – ZDNet

GitHub has released a host of third-party security tools for its just-launched code-scanning feature, which helps open-source projects nix security bugs before they hit production code.

GitHub Code Scanning works on top of CodeQL (Query Language), a technology that GitHub integrated into its platform after it acquired code-analysis platform Semmle in September 2019. GitHub announced general availability of code scanning last week after a beta phase that's run since May.

GitHub has now introduced 10 new third-party code-scanning tools that are available with GitHub code scanning to allow developers to remove flaws before they're committed to code.

The ability to add third-party tools to the native GitHub code-scanning feature lets developers customize it for different teams in an organization.

Extensibility is enabled via code scanning's application programming interface (API) endpoint, which ingests the results of scans from third-party tools using the Static Analysis Results Interchange Format (SARIF).
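To give a sense of what that ingestion looks like, the sketch below uploads a SARIF file to a repository's code-scanning endpoint. The call shape, a gzip-compressed, base64-encoded SARIF payload posted to /repos/{owner}/{repo}/code-scanning/sarifs, follows GitHub's documented REST API, but the repository, token, commit SHA and file name here are placeholders.

```python
# Sketch: uploading third-party scanner results (SARIF) to GitHub's
# code-scanning API. The repo, token, commit SHA and file path are
# placeholders; the payload must be gzipped and base64-encoded.
import base64
import gzip
import requests

REPO = "octo-org/example-repo"  # placeholder repository
TOKEN = "ghp_example"           # placeholder access token

with open("results.sarif", "rb") as f:
    sarif_b64 = base64.b64encode(gzip.compress(f.read())).decode()

resp = requests.post(
    f"https://api.github.com/repos/{REPO}/code-scanning/sarifs",
    headers={
        "Authorization": f"Bearer {TOKEN}",
        "Accept": "application/vnd.github+json",
    },
    json={
        "commit_sha": "d" * 40,  # placeholder 40-character commit SHA
        "ref": "refs/heads/main",
        "sarif": sarif_b64,
    },
)
resp.raise_for_status()
print(resp.json())  # includes an id for polling the upload's status
```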

GitHub sees it being valuable for organizations post-merger with teams running different code-scanning tools, as well as for extending coverage to mobile, Salesforce development or mainframe development. It also enables customized reporting and dashboards.

The new third-party scanning tools include extensions for static analysis and developer security training.

The current roster includes Checkmarx, Codacy, CodeScan, DefenseCode ThunderScan, Fortify on Demand, Muse, Secure Code Warrior, Synopsys Intelligent Security Scan, Veracode Static Analysis, and Xanitizer.

Developers can begin using third-party scanning tools with GitHub Actions, a feature that allows users to automate development workflows, or a GitHub App based on an event, such as a pull request.

GitHub then handles the rest of the task, ensuring there are no duplicates and that alerts are aggregated and associated with each tool that generates a report.

"The results are formatted as SARIF and uploaded to the GitHub Security Alerts tab. Alerts are then aggregated per tool and GitHub is able to track and suppress duplicate alerts," explains Jose Palafox of GitHub.

"This allows developers to use their tool of choice for any of their projects on GitHub, all within the native GitHub experience."

The third-party scanners are available on GitHub's marketplace.

During the beta, GitHub says code scanning was used to perform more than 1.4 million scans on more than 12,000 repositories. It's helped identify over 20,000 vulnerabilities.


Q&A: Experts Weigh in on the Hidden World of Shadow Code – Security Boulevard

Earlier this month, PerimeterX co-hosted a Tweet Chat with IT Security Guru on the topic of Shadow Code and invited a variety of industry experts including analysts, influencers and executives to weigh in on this little-known threat. The conversation lasted for an hour and delved into the issue from the perspective of DevOps, IT security, e-commerce and beyond. Participants included the following individuals:

Carlos: I think of #ShadowCode as the generally overlooked and often unknown third-party or nested service provider code that is incorporated into your e-commerce websites without the knowledge of the security team or awareness of its impacts on security, latency or compliance.

Jamie: #ShadowCode is the use of third-party scripts and libraries in a web application. 80% of code used in applications today originates outside an organization. External code, called open source, provides accelerated value delivery, but it also represents a risk to the organization.

Quentyn: #ShadowCode is code that's been cut and pasted from other third-party locations and may not have been vetted to the same degree as your own written code. It doesn't mean it's inherently insecure though.

Ameet: Application development today makes extensive use of third-party scripts and open source libraries, which are great for innovation and agility, but the end result is you don't really know what code is running.


Leaving Cert grading meltdown shows why open source is top of the class – Siliconrepublic.com

The Leaving Cert grading catastrophe is a great argument for open-source approaches to code for public use, writes Elaine Burke.

Investor and Irish tech veteran Brian Caulfield neatly summarised what many know to be true when it comes to delivering on a project. "You can optimise for two of time, cost and quality. Never all three," he told Adrian Weckler in the Irish Independent, deftly explaining what went wrong with the Leaving Cert grading.

The Government was under time pressure to deliver a fair and effective solution for Leaving Cert grading in the absence of exams this year due to Covid-19. Now, the results are in. Perhaps not an abject failure, but certainly Minister for Education Norma Foley, TD, and her department didn't meet the grade.

Like the students who only start to cram in the studying after the mocks and mid-term break, the Government may have waited too long to take the task ahead seriously. There may have been a "wait and see" tactic in place, keeping an eye on how our neighbours in the UK got on with the A Levels. And when that descended into chaos, there was no tried and tested path to follow. The Irish Government was on its own to come up with something that would work for students nationwide.

But Government officials didn't have to work in isolation. In fact, that was the choice that led to their undoing. They should have considered group study, which might have helped them improve their results.


Thankfully, after the A Levels disaster, the Irish Government opted for a system that would potentially put greater pressure on university placements but cause less heartache for students. That is, the Leaving Cert grading code would likely result in grade inflation compared to standard testing years, but that was the hit worth taking.

Unfortunately, that wasn't the only problematic outcome. A retrospective analysis of the code already deployed found a number of errors affecting the entirety of 2020's Leaving Cert grading. To be fair to students, who were already through two rounds of university placement offers and awaiting the third at this point, none would be downgraded. However, more than 6,000 would be upgraded, further adding to the pressure on institutions to make accommodations.

Going back to Caulfield's point, you can see which of the three factors the Government lost marks on. The quality of the Leaving Cert grading system was sacrificed to the time restrictions. Clearly, the analysis that turned up the errors should have been done beforehand, but the country was on a national deadline.

But there's another line of advice for project delivery the Government should have kept in mind: many hands make light work. If the Government hadn't been so secretive about the Leaving Cert grading system it was working on, all sorts of experts would have been able to review the code and spot errors the assigned team simply didn't have time to.

In fact, when it comes to projects of such significance to the public, open source should be the default. Not going open source with the Leaving Cert grading code was a missed opportunity for the Government to get support from the tech community at large for a project of national importance. Deciding to develop behind closed doors has helped no one and led to failure.

And Foley's department has no excuse in ignorance, as we have seen how coding for the public can be greatly successful thanks to open-source software. The Covid Tracker Ireland app was developed in the open, and so many concerns about tracing, privacy and efficacy were addressed in good time. The result of this open collaboration was an app that was largely accepted by groups not often in agreement on such things.

Now, under the project name Covid Green, the source code of the Irish Covid-19 contact-tracing app has been made available for other public health authorities and their developers across the world to use and customise. The Waterford-based company behind it, NearForm, manages the source code repository on GitHub and this has been used to roll out apps in a number of countries and regions so far.

This was an incredible success of open-source software development for the public good, with an Irish company's work being held up as something of a gold standard around the world. And for some reason the Irish Government ignored that signal and did what public bodies tend to do with their decision-making: shroud it in secrecy and hope for the best.

Let's just hope the Leaving Cert grading disaster will be a lesson learned and that the Government can see the clear benefits of open-source development of public projects in the future. In fact, they can even join us at Future Human to learn more about open source from none other than NearForm founder Cian Ó Maidín.


Future Human is Silicon Republic's international sci-tech event focusing on the future of work, climate change, AI, security, robotics and life sciences. On 29 and 30 October 2020, it will take place as the first major hybrid tech event of its kind in the world. General, Executive and Student tickets are available now.
