Manage open source risk in turbulent times through governance & automation – ITProPortal

The phrase "corporate change management" frequently conjures images of CEOs, executives and leadership teams spearheading responses to market fluctuations, industry shifts, technological disruption and unknown forces like the current coronavirus pandemic. Today's turbulent times require a fresh approach. Modifications to processes organisation-wide can help teams prepare for and respond to changing environments, while strengthening an organisation's position and ability to withstand challenges.

When any of the events above happen, companies can find themselves with reduced or reorganised resources. When resources are limited, automating processes that were previously done manually can become a top priority. As companies take cost-cutting measures (reducing staff, streamlining teams, tightening their belts on resources), it becomes a good time to strengthen infrastructure, technology and back-office systems and processes.

One area where this is particularly relevant is the management of open source software (OSS), including management of the license compliance and security issues that come with its use. Developers leverage OSS in their proprietary applications to speed up time to market and drive innovation; applications are often composed of 80 per cent or more OSS. However, the OSS knowledge gap continues to widen: companies typically disclose only around 6 per cent of the OSS usage that audits actually find. Unmanaged usage can lead to more security vulnerabilities, exploits and breaches, driving an emphasis on regulations from multiple industry groups. Now is not the time to reduce open source usage or disregard open source risk.

Software Composition Analysis (SCA) is the process of identifying and recording the use of all open source software components in your codebase. Open source software can enter your codebase in many forms, including software packages, containers, build dependencies, complete source code files, copy/paste fragments of source code files, binaries, images, documents and multimedia files. SCA helps manage IP risks due to non-compliance with legal obligations, security risks due to vulnerable OSS components, and reputation risks. Ignoring or deprioritising SCA due to resource constraints might seem prudent in the short term, but it can lead to an incomplete and inaccurate Software Bill of Materials (SBOM), increasing the aforementioned risks over time.
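To make this concrete, here is a minimal sketch of the kind of record an SBOM captures. The field names and entries below are purely illustrative assumptions, not drawn from the article; real SBOMs typically follow a standard format such as SPDX or CycloneDX and carry far richer metadata.

```python
# A toy SBOM: illustrative only. Real SBOMs follow standards such as
# SPDX or CycloneDX and include much richer component metadata.
sbom = [
    {"component": "openssl", "version": "1.1.1g",
     "license": "Apache-2.0", "origin": "build dependency"},
    {"component": "parser-snippet", "version": "n/a",
     "license": "MIT", "origin": "copy/paste fragment"},
    {"component": "readline", "version": "8.0",
     "license": "GPL-3.0-only", "origin": "linked library"},
]

# Even a simple inventory lets you start asking governance questions,
# such as which components carry strong-copyleft obligations.
for entry in sbom:
    if entry["license"].startswith("GPL"):
        print(f"Review obligations for {entry['component']} ({entry['license']})")
```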

The need to respond to corporate changes presents an opportunity to invest in engineering process improvements and technical debt burndown, while mitigating the risk that comes with reliance on OSS components. Limited engineering cycles mean that wasting time on remediation isn't wise; anything that can help catch issues early and often becomes critical, allowing engineering to continue to focus on innovation. Integrating SCA into the DevOps process can help companies gear up for higher customer demand when the economy improves and the pace of engineering increases further. Managing open source risk in a changing environment is all about governance and automation, in tandem.

Governance of an open source software usage program, including oversight and control, is more important than ever to protect against malicious attacks in times of disruption or great change. To implement any governance program, understanding what's in your software is essential. An SBOM, created through the SCA process, is therefore imperative for any effective governance program. The SBOM provides transparency into the makeup of your software applications, recording the sub-components and dependencies that comprise your applications, along with their associated licenses and security vulnerabilities. The SBOM can help inform your approach to open source policy and help you react to vulnerabilities. Given the complexity of today's software applications, achieving and governing a complete, secure open source ecosystem is impossible without a complete and accurate SBOM.

There is no time like the present to establish an open source governance program and integrate it into the development toolchain in order to comply with open source licenses, manage obligations and maintain up-to-date knowledge of relevant security vulnerabilities impacting your application. Rather than performing ad-hoc scans or implementing scanning toward the end of your application development cycle, start earlier in the DevOps lifecycle, with an automated and cost-effective process. This expand-left approach helps manage compliance and security risks early and often, while protecting your intellectual property (IP). On the other hand, an incomplete, delayed or ignored SCA process may lead to costly litigation or last-minute remediation work.

Moving this scanning earlier in the lifecycle, and scanning often, can help streamline the efforts required to meet regulatory standards. For example, SBOMs are essential to meet PCI (payment account data) standards for secure software; the EU's General Data Protection Regulation (GDPR) requirements for security of processing; the US Food & Drug Administration (FDA) updated cybersecurity recommendations; and the National Telecommunications and Information Administration (NTIA) guidelines for software component transparency. All of these initiatives share common requirements around component transparency and vulnerability management.

The good news is that all of these requirements can be satisfied with continuous SCA. In a nutshell, continuous SCA is an evolution from ad-hoc audits to a manage-by-exception process.

As staffing contracts and internal audit teams are often scaled back during times of turbulence and extreme corporate change, the remaining staff are burdened with more work. That skeleton crew must prioritise efficiency and productivity in order to get through the workload.

Automation in SCA makes up for manual effort that is easily lost or forgotten. An automated, standardised and repeatable process can help manage your open source license inventory with increased accuracy, even at a time when staffing constraints are particularly tight. However, like all complex processes that require human interpretation, some manual steps will always remain. It is therefore important to factor those into your process and set the right expectations with your stakeholders.

An automated SCA lifecycle includes several elements.

Automation makes open source management an intrinsic part of the engineering process. First and foremost, it means that no one has to push a button or otherwise remember to start an inventory management process at a particular time. Automated alerts inform users of critical changes to their SBOM items, including IP compliance issues and new security vulnerabilities. Tasks are automatically created and assigned to the appropriate users to track remediation work for non-compliant items.
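As a rough illustration of this manage-by-exception flow (not any vendor's actual implementation), the sketch below diffs two SBOM snapshots and raises a "task" only for deltas that violate a policy. The policy, component names and CVE placeholder are all invented for the example.

```python
# Illustrative manage-by-exception check. The policy, SBOM shape and
# "task" output are invented for this example; a real SCA tool would
# pull this data from its scanner and vulnerability feeds.
DENIED_LICENSES = {"AGPL-3.0-only", "SSPL-1.0"}  # example policy only

def diff_sboms(previous, current):
    """Return entries that are new or changed since the last scan."""
    seen = {(e["component"], e["version"]) for e in previous}
    return [e for e in current if (e["component"], e["version"]) not in seen]

def raise_tasks(changed):
    """Create a remediation 'task' (here, a printout) per exception."""
    for entry in changed:
        if entry["license"] in DENIED_LICENSES:
            print(f"TASK: license review for {entry['component']} "
                  f"{entry['version']} ({entry['license']})")
        if entry.get("known_vulns"):
            print(f"TASK: patch {entry['component']} {entry['version']}: "
                  f"{', '.join(entry['known_vulns'])}")

previous = [{"component": "zlib", "version": "1.2.11", "license": "Zlib"}]
current = previous + [
    {"component": "dbtool", "version": "2.0", "license": "SSPL-1.0"},
    {"component": "zlib", "version": "1.2.12", "license": "Zlib",
     "known_vulns": ["CVE-XXXX-YYYY"]},  # placeholder identifier
]

raise_tasks(diff_sboms(previous, current))
```

Compliant components never surface; only the two exceptions generate work, which is what keeps the process manageable for a small team.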

Data currency is also strengthened through automation. Manual methods, including audits, provide snapshots of a codebase at a particular point in time. While useful, these snapshots become outdated quickly and can result in compliance issues being identified too late in the software development life cycle. Furthermore, the volume of data in a snapshot model is overwhelming, whereas an automated, continuous approach is more manageable. Since your codebase and the resulting SBOM aren't static, integrating code scanning into the DevOps environment is a way to ensure that risks are caught and remediated early and often. The cost to remediate a security issue in a released application can be up to 150 times that of fixing an issue identified early in the development process.

Nothing is static. Code isn't static, and neither are the processes related to ensuring its security. The only constant is change. Because risk compounds over time, addressing risk early and implementing an effective program for ongoing monitoring of that risk is critical. Ignoring license compliance or deferring security risk management can be more costly in the long run than actualising an open source governance program from the start.

Alex Rybak, Director of Product Management, Revenera


OpenPOWER Reboot – New Director, New Silicon Partners, Leveraging Linux Foundation Connections – HPCwire

Earlier this week the OpenPOWER Foundation announced the contribution of IBM's A2I Power processor core design to the open source community. Roughly this time last year, IBM announced it was open sourcing its Power instruction set architecture (ISA), Open Coherent Accelerator Processor Interface (OpenCAPI) and Open Memory Interface (OMI). That's also when IBM said OpenPOWER would become a Linux Foundation entity. Then a few weeks ago, OpenPOWER named a new executive director, James Kulina.

Change is afoot at the OpenPOWER Foundation. Will it be enough to prompt wider (re)consideration and adoption of the OpenPOWER platform and ecosystem?

The big gun here historically has been IBM, which, with Google, Tyan, Nvidia, and Mellanox, founded the OpenPOWER Foundation in 2013. It was meant to challenge Intel's processor dominance in servers and HPC by offering an alternative. Not surprisingly, creating a new ecosystem turns out to be hard. For one thing, IBM was the lone Power chip provider, sometimes making OpenPOWER seem like just another IBM venture. That's probably unfair and certainly understates the formidable capabilities of IBM's Power microprocessor (think Summit supercomputer) and the technical strengths of OpenCAPI and OMI.

One has to admire IBM's pluck in dumping virtually all of its x86-based systems businesses for a big bet on IBM Power microprocessors and OpenPOWER generally.

That was then. This is now. For many reasons (let's just lump them together as insufficient market traction), IBM now looks to be deemphasizing its bet on OpenPOWER and passing the torch, or at least a big chunk of it, to others. A strong piece of evidence is that a successor to the Power9 processor (Power10) seems long overdue. Indeed, Kulina has written in a blog introducing himself: "It's my goal to make OpenPOWER one of the easiest platforms to go from an idea to a silicon chip."

That means getting more Power chips from more silicon providers, as has sometimes been discussed, if not aggressively pursued, in the past. The sudden emergence of processor diversity (AMD, Arm, RISC-V) and the slight loosening of Intel's grip on the market suggest the window for alternatives is open. Performance and price will be critical.

The question is: can what often seemed like an IBM-plus-a-few-friends club transform into a thriving ecosystem that captures meaningful pieces of the HPC, mainstream server, cloud, edge, and perhaps other market segments? OpenPOWER currently reports it has 350-plus members, 150 OpenPOWER-ready certified products and 40 OpenPOWER systems shipping or in development. Kulina has high hopes for expansion and recently discussed his plans for growth with HPCwire. Presented here are a few of his comments.

HPCwire: OpenPOWER has generally been seen as IBM-centric and mostly a way to push its chips and systems. Is that wrong?

James Kulina: I actually would agree with you that historically the connotation has been that this is an IBM pet project. If you talk to IBM, they don't want that. I think the announcement last year actually sets us up for the next chapter, where we can go and be this fully independent entity, where we can build out an ecosystem: silicon, hardware, systems and software. Being under the Linux Foundation gives us access to the latest and greatest open source software technologies and knowledge of how to actually build out really sustainable ecosystems.

So that's kind of where I come from anyway. We want to be viewed as an independent entity. We want people to come to the OpenPOWER Foundation to adopt the technology, the Power ISA, and all the other peripheral technologies that can spin out of that. We don't want it viewed as, we're coming to IBM first and OpenPOWER is just this thing that sits off to the side. That's not how it's going to be moving forward.

We're actually going to be proactive in our approach towards building out an ecosystem. To get to the idea that I framed around silicon, that's part of it, right? We need to have more people actually building chips. I think before, a lot of companies were just waiting to see what IBM would do with its silicon with the Power architecture. But now, [with] geopolitical [forces] and whatnot, you see a lot of interest in having a fully open source architecture and ISA to adopt, to actually develop [their] own domestic chips. [For example], a lot of companies out of China and other areas are already asking about Power and are very interested in it. [That] will lead to really good things, because anybody can get involved in taking the Power ISA and customizing it, as long as they stay within compliance.

HPCwire: That seems like a lot to tackle. What makes you think OpenPOWER can pull off a successful restart that avoids past missteps?

James Kulina: I've been following OpenPOWER for a while now, actually, since it started. I'm very interested in hardware in general, and early in my career I did hardware design. I always wanted to get back into it and have always thought the success of open source software could now lead to [success in] open source hardware. The software folks have [built] roadmaps. It is different, because hardware is different. The cost structure is different (how you sell and buy and all that), but there's a lot of learnings that we can utilize in terms of talking to actual silicon vendors.

I've had preliminary talks about who we might want to go after, and what we might want to do with them. But it's really going to be around showing the value of the architecture first, showing the use cases that can benefit from Power. I also think software is a very key component in making sure that people don't have any reservations about adopting a new platform. So my view is that our goal as a foundation is to de-risk as much as possible, to think three to four steps ahead, and to try to take out any of those barriers that might be preventing people from adopting the technology.

HPCwire: What do you see as the advantages the Power ISA brings to the market? What distinguishes it from existing architectures, particularly given we seem to be entering an era of processor diversity with many aspirants?

James Kulina: The first [advantage] is that it is a mature technology. It's not still being ratified, like RISC-V. Now, RISC-V is a great project and we're happy to be involved with them, but it's still nascent. The Power architecture is, you know, running the two fastest supercomputers in the world already. It has a proven software ecosystem, although I still think that can be drastically enhanced in terms of what type of software is running. We need to cozy up more towards the things that a lot of your average developers would be interested in, not just HPC developers. There's a lot of room for growth there and a lot of interest.

So OpenPOWER is a mature ecosystem. There's also the fact that we have the OpenCAPI initiative. Hopefully that will be merging with the other 4-10 or however many other [standards efforts] that are basically doing the same thing, CXL (Compute Express Link) and all that. These are things we have that are already fairly mature, production grade and enterprise grade, and fully open source. One great thing we have is that you are patent-protected under the structure we have. That makes it a lot easier for people to actually adopt and consume [OpenPOWER] technology and to add extensions.

HPCwire: Can you give us a few examples of specific use case areas that you think Power is well suited for?

James Kulina: The low-hanging fruit is still HPC and enterprise, but there are others we want to understand better. I'm still coming up to speed on where we actually stand in other use cases and other segments, such as telecommunications, networking, edge computing, AI, and the like. This is where, by being part of the Linux Foundation, we get an inside track into all of those segments, because they're leaders in driving the software that's running these [segments]. We're already starting conversations with top leaders in those projects to see how we can create a feedback loop between us, so we know what we need to start building into our technology so that their members can actually adopt it.

HPCwire: How about nearer-term plans for the Power10 processor? We'd been expecting either more Power9 options or the Power10 introduction by now. Arm and AMD have been gaining momentum while the Power chip line seems stuck.

James Kulina: I'm not really plugged into what IBM is doing there; you'd have to talk to IBM directly about that. I will say it's going to be a fantastic chip. But you know, this is why we want multiple vendors for silicon. I think the Googles of the world, the hyperscalers of the world, wanted to see multiple vendors as well. And there's a lot of people in the middle, right, that want that kind of technology but don't have the resources, and this is where I think we can have a groundswell of support around a fully open source platform.

HPCwire: It sounds like you're expecting the OpenPOWER platform to expand beyond HPC.

James Kulina: We're not abandoning HPC at all. There's a lot of work that we can be doing in HPC. I honestly believe a lot of the technologies happening in the cloud space are going to push into the HPC space. You're already seeing containers and Kubernetes being adopted in HPC, and, you know, they're now adopting things that traditional HPC workload managers have been doing. There's going to be a merging, I think, of these spaces. That's actually a good thing, because then you don't have silos and can have workloads be fully portable between HPC environments as well as cloud-native environments.

So we want to make sure that we have irons in both fires there. But I definitely think there's a lot of momentum with RISC-V. There's a lot of momentum for Arm and AMD, and that's because they have platforms that are readily accessible to the developer community. That's always been a struggle for OpenPOWER systems, because it's such an enterprise-grade system. It's such a beast of a system and really can do a lot of great things, but we need to have multiple silicon vendors so we can actually gain access to a wider swath of developers.

HPCwire: What sorts of milestones are you setting for yourself for the next 6, 12 and 18 months? What are you hoping to get done?

James Kulina: I think in the first couple of months we need to think through organizational issues in terms of work groups. How do we architect our work groups? I want them to be more around use cases and segments and not just focused specifically on technologies. It makes it a lot easier for members if they can understand what's in it for them, where they fit and how they can gain value.

The first thing is just getting organized in a way so that we can truly scale. After that, it's more about interfacing with silicon providers, IP houses, as well as fab houses. At every layer of the stack in the pipeline, we want to see what we can do to make it easier to build new silicon as well as systems, getting the right people in the room to talk things through and making those feedback loops as efficient as possible. My hope is that within a year or a year and a half, we actually have another silicon provider out there. And, you know, that gives us some breathing room in terms of showcasing how an ecosystem can then grow even further from that.

HPCwire: Do you hope to see silicon targeting broader segments?

James Kulina: That's my goal. Whether it's going to happen in a year or two years, it might be too early to say.

HPCwire: What do you think are the most pressing challenges?

James Kulina: The first thing is showcasing the use cases and the value proposition, and getting organized into a state where we can scale. The next thing is more about access: getting systems to people, getting to developers in particular, getting integrated with all the key open source groups, and making sure that the Power architecture is a first-class citizen in the kinds of languages and the kinds of software that people actually care about. Then it's getting silicon providers on board, investigating Power and then hopefully adopting Power and producing new chips.

There's also an education piece, a curriculum piece, that we need to focus on as well. We have a fully open source platform, which is ready, and it's very useful in the academic world as well as in the commercial space: you can have people go from the nuts and bolts all the way up through the stack and see how everything works. We're working with a number of universities to figure out how we can actually put together a curriculum, and how we can make that accessible not just for the students there, but also globally. There's a lot of stuff that we can do in terms of, you know, events and meetups and hackathons and all that.

HPCwire: Are there any specific synergies between Red Hat (IBM) and OpenPOWER?

James Kulina: Well, they [Red Hat] have an internal multi-architecture group and are already testing across the Power architecture. There's a lot more we can be doing. It boils down, again, to getting access to Power systems. I think that a lot of Red Hatters don't have access to it. So that is one of the things that will improve now that Red Hat's part of IBM.

HPCwire: Could you share a little bit more about how you came into the position?

James Kulina: I've been following OpenPOWER for quite some time and I saw the announcement last year. In my previous role, I co-founded an open source cloud startup called Hyper.sh, and we were acquired last year by Ant Financial, an Alibaba affiliate. I was going through that transition and they eventually asked me to relocate to China. I said no for a number of reasons, took a couple of months off, and saw that OpenPOWER had announced it was fully open sourcing the Power ISA. I reached out to Hugh [Blemings, former OpenPOWER executive director and now an OpenPOWER board advisor], whom I've known for a while through the open source ecosystem, and said, "Do you need any help? Because this is awesome; it is actually what needed to happen six years ago." Hugh said, "Actually, we do," and it kind of kicked off from that.

Brief Bio of James Kulina

James is Executive Director of the OpenPOWER Foundation, with over 10 years of open source experience across hardware, software, and network engineering disciplines. James brings a passion for open source and is committed to growing the OpenPOWER Foundation's membership, community, and ecosystem. He is a serial entrepreneur with a background in enterprise technology and has worked in roles spanning operations, business development, product management, and engineering.

Previously, James was co-founder and COO at Hyper.sh, an open source cloud-native virtualization startup acquired by Ant Financial. Prior to that, he led product management in Red Hat's OpenStack group and was a product lead on AT&T's first OpenStack cloud. James graduated from the University of Virginia with a degree in Electrical Engineering and is based in New York.

Link to the article on the A2I core design just contributed to open source: https://www.hpcwire.com/off-the-wire/a2i-power-processor-core-contributed-to-openpower-community-to-advance-open-hardware-collaboration/


Baidu Joins the Open Invention Network Community – GlobeNewswire

DURHAM, N.C., June 30, 2020 (GLOBE NEWSWIRE) -- Open Invention Network (OIN), the largest patent non-aggression community in history, and Baidu, the largest Chinese language search engine and one of the leading artificial intelligence (AI) companies in the world, announced today that Baidu has joined as a community member. As an active supporter of open source and an important contributor of global open source technology, Baidu is committed to promoting the rapid development of AI through an open source platform and facilitating industrial transformation.

"Artificial intelligence-driven and internet-based services continue to spawn new industries while advancing business performance through actionable intelligence. As a global leader in internet and AI-related services and products, Baidu recognizes the benefits of shared innovation inherent in open source," said Keith Bergelt, CEO of Open Invention Network. "We are pleased Baidu has joined our community and committed to patent non-aggression in Linux and adjacent open source technologies."

"Baidu is and will always be a strong supporter of and participant in open source," said Cui Lingling, the head of Baidu's Patent Department. "Baidu has launched a number of open source platforms, including Apollo (Autonomous Driving Platform), PaddlePaddle (Parallel Distributed Deep Learning) and the like, and has been actively building patent cooperation. Baidu is a world-leading artificial intelligence platform company. Baidu's participation in the OIN community shows our consistent commitment to open source innovation. Baidu will continue to support Linux patent protection and help foster a healthy Linux ecosystem."

OIN's community practices patent non-aggression in core Linux and adjacent open source technologies by cross-licensing Linux System patents to one another royalty-free. Similarly, OIN licenses its patents royalty-free to organizations that agree not to assert their patents against the Linux System. The OIN license can be signed online at http://www.j-oin.net/.

About Baidu

Founded on January 1, 2000, Baidu is a leading search engine, knowledge- and information-centered Internet platform and AI company. With over 1.1 billion monthly active devices running Baidu mobile apps, Baidu is the primary platform for Internet users to access Chinese information, responding to billions of search requests from more than 100 countries and regions daily.

Baidu's story began when the company's co-founder, Robin Li, was awarded a U.S. patent for his initial development of the Rankdex site-scoring algorithm for search engine page rankings, making China one of only four countries in the world, alongside the U.S., Russia and South Korea, with core search engine technologies. According to the Patent Protection Association of China, Baidu held 5,712 AI patents in 2019, the most in China.

Baidu keeps technological innovation at the heart of its business and has been a global leader in innovation investment, R&D and talent acquisition. With its mission to make the complicated world simpler through technology, Baidu is committed to providing products and services that better understand users and promote constant technological innovation. Through years of investment and development, and the company's global leadership in deep learning, Baidu ranked fourth on Harvard Business Review's 2019 list of the top five global AI companies and is the only company on the list from China.

PaddlePaddle, as the only independent R&D deep learning platform in China, has been officially open-sourced to professional communities since 2016. DuerOS, as an open operating system, has released an open platform, built a voice ecosystem, and provided support for third-party integration. Baidu Cloud primarily provides AI solutions, cloud infrastructure and other services to enterprises and individuals. Apollo, as the world's largest open-source autonomous driving platform, supports commercial production of autonomous driving vehicles and incorporates autonomous driving capabilities, including valet parking.

Under its strategy of strengthening its mobile foundation and leading in AI, Baidu has built an increasingly prosperous and powerful mobile ecosystem and steadily improved its AI ecosystem with accelerated commercialization.

About Open Invention Network

Open Invention Network (OIN) is the largest patent non-aggression community in history and supports freedom of action in Linux as a key element of open source software (OSS). Patent non-aggression in core technologies is a cultural norm within OSS, and OIN membership has become a litmus test for authentic behavior in the OSS community. Funded by Google, IBM, NEC, Philips, Sony, SUSE and Toyota, OIN has more than 3,200 community members and owns more than 1,300 global patents and applications. The OIN patent license and member cross-licenses are available royalty-free to any party that joins the OIN community.

For more information, visit http://www.openinventionnetwork.com.

Media-Only Contact:

Ed Schauweker
AVID Public Relations for Open Invention Network
ed@avidpr.com
+1 (703) 963-5238


ModalityIQ Announces Acquisition of The Machine Learning Conference – PR Web

NEW YORK (PRWEB) July 03, 2020

ModalityIQ, a community-based knowledge sharing and learning platform, is pleased to announce the acquisition of The Machine Learning Conference (MLconf and MLconf.com), a leader in serving the Machine Learning and Artificial Intelligence community.

Founded by Courtney and Shon Burton in 2012, MLconf has gathered thousands of members of the machine learning community to coalesce and share lessons learned through a series of 20+ conferences showcasing the latest innovations in machine learning tools, techniques and algorithms, attracting data scientists, machine learning professionals and others who have a vested interest in ML/AI.

"Sharing lessons learned that help advance the ML/AI community is at the core of ModalityIQ's mission, and is why the acquisition of MLconf made sense for us," said Richard Rivera, Founder & CEO of ModalityIQ. "Courtney's years of service and her dedication to providing a platform for the ML/AI community to share, learn, and inspire one another made her the obvious person for us to partner with. We look forward to working with her and the MLconf team to continue to facilitate the sharing of knowledge within the ML/AI community today and in the years to come."

In addition to hosting annual conferences, MLconf also has a vibrant online presence through a community blog and job board at mlconf.com, a monthly newsletter and social media, in addition to a prolific YouTube channel with hundreds of videos highlighting current and past MLconf presentations.

"For some time, I've seen areas of opportunity for MLconf to further engage the ML/AI community and expand our focus to facilitate the sharing of knowledge and lessons learned, and I believe partnering with ModalityIQ will help elevate and enrich our mission," said Courtney Burton. "ModalityIQ's founding principle of fostering learning through collaboration is the same principle I founded MLconf on, and I am excited to join the ModalityIQ team."

Brian DeCicco from Berkery Noyes served as the exclusive financial advisor to MLconf.

About ModalityIQ

ModalityIQ is a community-based knowledge sharing and learning platform founded to facilitate the sharing of knowledge and know-how across the advanced and emerging technology landscape by providing an environment for exchanging ideas and inspiring open dialogue to advance technological capabilities and knowledge-gain in business and in people. For more information, visit http://www.modalityiq.com.



Challenges facing data science in 2020 and four ways to address them – TechRepublic

Finding value in data, integrating open source software, a small talent pool, and ethical concerns around data were found to be trouble areas in a new state of data science report.


A report on the state of data science from software firm Anaconda finds that data science is anything but a stable part of the enterprise. In fact, it has several serious challenges to overcome.


Luckily, Anaconda's report provides four recommendations organizations should focus on to address the problems it found in its survey of data science professionals: a lack of value realization, concerns over the use of open-source tools, trouble finding and retaining talent, and ethical concerns about bias in data and models.

"The institutions which rely on [data science] are still developing an understanding of how to integrate, support, and leverage it," the report said.

The four trouble areas that Anaconda found are key to the continued evolution of data science from an emerging part of enterprise business to a fundamental part of planning for the future of work.

The first problem, a lack of value realization, stems mainly from production roadblocks: managing dependencies and environments, a lack of the organizational skills needed to deploy production models, and security problems.

Combined, those three problems lead 52% of data science professionals to say they have trouble demonstrating the impact data science has on business outcomes. This varies across sectors, from healthcare, where data pros have the most trouble proving benefits (66% said they sometimes or never can do so), to consulting, where only 29% said the same.

"Getting data science outputs into production will become increasingly important, requiring leaders and data scientists alike to remove barriers to deployment and data scientists to learn to communicate the value of their work," the report recommends.

According to the report, open-source programming language Python dominates among data scientists, with 75% saying they frequently or always use it in their jobs.

Despite the popularity of open-source software in the data science world, 30% of respondents said they aren't doing anything to secure their open-source pipeline. Open-source analytics software is preferred by respondents because they see it as innovating faster and more suitable to their needs, but Anaconda concluded that the security problems may indicate that organizations are slow to adopt open-source tools.

"Organizations should take a proactive approach to integrating open-source solutionsinto the development pipeline, ensuring that data scientists do not have to use their preferred tools outside of the policy boundary," the report recommended.There's a caveat to mention here: Anaconda is the manufacturer of a Python-based open-source data science platform. The results of its survey may be tilted in favor of open-source products since people surveyed were recruited via social media and Anaconda's email database.

There are several layers of problems to parse through here. First, the report found that what students are learning and what universities are teaching isn't necessarily what enterprises need from new data scientists.

The two skill gaps most frequently cited by businesses, big data management and engineering skills, didn't even rank in the top 10 skills universities are offering their data science students.

Another layer of problems comes in talent retention, which the report found is closely tied to how often data science professionals are able to prove the value of their work. Across the board, 44% of data scientists said they plan to look for a different job within the next year.

Anaconda makes three recommendations in the report to address this problem.

"Of all the trends identified in our study, we find the slow progress to address bias andfairness, and to make machine learning explainable the most concerning," the report said.

Ethics, responsibility, and fairness are all problems that have started to spring up around machine learning and artificial intelligence, and Anaconda said enterprises "should treat ethics, explainability, and fairness as strategic risk vectors and treat them with commensurate attention and care."

Despite the importance of addressing bias inherent in machine learning models and data science, doing so isn't happening: Only 15% of respondents said they had implemented a bias mitigation solution, and only 19% had done so for explainability.

Thirty-nine percent of enterprises surveyed said they had no plans to address bias in data science and machine learning, and 27% said they have no plans to make the process more explainable.

"Above and beyond the ethical concerns at play, a failure to proactively address these areas poses strategic risk to enterprises and institutions across competitive, financial, and even legal dimensions," the report said.

The solution that Anaconda recommended is for data scientists to act as leaders and try to drive change in their organizations. "Doing so will increase the discipline's stature in the organizations which depend on it, and more importantly, it will bring the innovation and problem-solving, for which the profession is known, to address critical problems impacting society."



New Training Course Aims to Make it Easy to Get Started with EdgeX Foundry – Container Journal

Course explains what EdgeX Foundry is, how it works, and how to use it in your edge solutions, leveraging the support of LF Edge's large ecosystem

SAN FRANCISCO, July 1, 2020 – The Linux Foundation, the nonprofit organization enabling mass innovation through open source, today announced the availability of a new training course, LFD213 Getting Started with EdgeX Foundry.

LFD213 was developed in conjunction with LF Edge, an umbrella organization under The Linux Foundation that aims to establish an open, interoperable framework for edge computing independent of hardware, silicon, cloud, or operating system. The course is designed for IoT and/or edge software engineers, system administrators, and operations technology technicians who want to assemble an edge solution.

The course covers how EdgeX Foundry is architected, how to download and run it, and how to configure and extend the EdgeX framework when needed. The four chapters of the course, which take approximately 15 hours to complete, provide a basic overview; a discussion of device services, which connect physical sensors and devices to the rest of the platform; application services, which send data from EdgeX to enterprise applications, cloud systems, external databases, or even analytics packages; and more.

Hands-on labs enable students to get EdgeX up and running and play with some of its important APIs, as well as create a simple service (either a device or application service) and integrate it into the rest of EdgeX.

EdgeX Foundry is an open-source, vendor-neutral, hardware- and OS-agnostic IoT/edge computing software platform that is a Stage 3 (Impact) project under LF Edge. In the simplest terms, it is edge middleware that sits between operational technology (physical sensing "things") and information technology systems. It facilitates getting sensor data from any "thing" protocol to any enterprise application, cloud system or on-premises database. At the same time, the EdgeX platform offers local/edge analytics to enable low-latency decision making at the edge, actuating back down onto sensors and devices. Its microservice architecture and open APIs allow third parties to provide their own replacement or augmenting components and add additional value to the platform. In short, EdgeX Foundry provides the means to build edge solutions more quickly and leverage the support of a large ecosystem of companies that participate in edge computing.
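As a rough sketch of what consuming those open APIs can look like, the snippet below polls a core-data endpoint for a device's recent events. The port and path are assumptions based on EdgeX v1-era defaults, and the device name is hypothetical; check the documentation for your EdgeX release before relying on them.

```python
# Illustrative only: query an EdgeX core-data endpoint for recent events.
# The port (48080) and path reflect assumed EdgeX v1-era defaults and the
# device name is hypothetical; verify against your EdgeX release docs.
import json
import urllib.request

CORE_DATA = "http://localhost:48080/api/v1"  # assumed v1 default

def recent_events(device_name, limit=5):
    """Fetch up to `limit` recent events reported by a named device."""
    url = f"{CORE_DATA}/event/device/{device_name}/{limit}"
    with urllib.request.urlopen(url) as resp:
        return json.loads(resp.read())

# Hypothetical device name, for illustration only.
for event in recent_events("my-sensor-device"):
    for reading in event.get("readings", []):
        print(reading.get("name"), reading.get("value"))
```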

"EdgeX Foundry is on a phenomenal growth trajectory with multiple releases and millions of container downloads," said Jim White, EdgeX Foundry Technical Steering Committee Chair and CTO of IOTech Systems. "Given the scale of the adopting community and ecosystem, it is critical that there is proper training available to allow new adopters and prospective users to learn how to get started. The new training, created by the architects of EdgeX Foundry and managed by The Linux Foundation, will give developers exploring EdgeX a faster and better path to understanding and working with EdgeX, while also accelerating our project's adoption at scale."

The course is available to begin immediately. The $299 course fee provides unlimited access to the course for one year, including all content and labs. Interested individuals may enroll here.

About the Linux Foundation

Founded in 2000, the Linux Foundation is supported by more than 1,000 members and is the world's leading home for collaboration on open source software, open standards, open data, and open hardware. The Linux Foundation's projects are critical to the world's infrastructure, including Linux, Kubernetes, Node.js, and more. The Linux Foundation's methodology focuses on leveraging best practices and addressing the needs of contributors, users and solution providers to create sustainable models for open collaboration. For more information, please visit us at linuxfoundation.org.

The Linux Foundation has registered trademarks and uses trademarks. For a list of trademarks of The Linux Foundation, please see its trademark usage page: www.linuxfoundation.org/trademark-usage. Linux is a registered trademark of Linus Torvalds.



Custom Packet Sniffer Is A Great Way To Learn CAN – Hackaday

Whilst swapping out the stereo in his car for a more modern Android-based solution, [Aaron] noticed that it utilised just a single CAN differential pair to communicate with the car, as opposed to a whole bundle of wires employing analogue signalling. This is no surprise, as modern cars invariably use the CAN bus to establish communication between various peripherals and sensors.

In a series of videos, [Aaron] details how he used this opportunity to explore some of the nitty-gritty of CAN communication. In Part 1 he designs a cheap, custom CAN bus sniffer using an Arduino, an MCP2515 CAN controller and a CAN bus driver IC, demonstrating how this relatively simple hardware arrangement can be used along with open source software to decode some real CAN bus traffic. Part 2 of his series revolves around duping his Android stereo into various operational modes by sending the correct CAN packets.
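If you want to try something similar without building [Aaron]'s Arduino hardware, the sketch below uses the open source python-can library on a Linux box with a SocketCAN-compatible adapter. It's an alternative route, not the one from the videos, and the channel name and bitrate are assumptions for the example.

```python
# Minimal CAN sniffer using the open source python-can library over
# Linux SocketCAN; an alternative to the Arduino + MCP2515 approach.
# Assumes the interface was brought up first, e.g.:
#   sudo ip link set can0 up type can bitrate 500000
import can

bus = can.interface.Bus(channel="can0", bustype="socketcan")

seen = {}  # arbitration ID -> frame count, to spot chatty IDs
try:
    while True:
        msg = bus.recv(timeout=1.0)
        if msg is None:
            continue  # nothing received within the timeout
        seen[msg.arbitration_id] = seen.get(msg.arbitration_id, 0) + 1
        data = " ".join(f"{b:02X}" for b in msg.data)
        print(f"ID=0x{msg.arbitration_id:03X} DLC={msg.dlc} "
              f"DATA=[{data}] count={seen[msg.arbitration_id]}")
except KeyboardInterrupt:
    bus.shutdown()
```

Watching which arbitration IDs change as you press buttons is the usual way to map frames to functions, the same reverse-engineering loop [Aaron] walks through in Part 1.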

These videos are a great way to learn some of the basic considerations associated with the various abstraction layers typically attributed to CAN. Once you've covered these, you can do some pretty interesting stuff, such as these dubious devices pulling a man-in-the-middle attack on your odometer! In the meantime, we would love to see a Part 3 on CAN hardware message filtering and masks, [Aaron]!


Tim O'Reilly – COVID-19 is an opportunity to break the current economic paradigm – Diginomica


Tim O'Reilly has played a critical part in framing some of the most influential conversations about the role of technology in economies and across society since the early 1990s. Concepts and movements such as open source software, Web 2.0, Government-as-a-Platform and the WTF Economy are all well known and referenced widely within the technology industry. In other words, when O'Reilly speaks up about something, people tend to pay attention.

Given the fundamental shifts we are seeing across the economy now and the rapid escalation of using digital tools to counter the effects of the COVID-19 pandemic, it's unsurprising that O'Reilly has some opinions.

Just as a disclaimer, the following ideas have been selected from a wide-ranging conversation that covered a variety of topics (including at one point, O'Reilly wrangling some chickens - no, really). But I feel that almost all the talking points played into the same overarching theme - COVID-19 has shown us that drastic change is possible when there is enough will and force used. With this in mind and knowing that the status quo doesn't have to be sustained, what sort of society would we like to build going forward?

O'Reilly said:

I think the impact of the pandemic is sort of meta, in that it has simply told us, loud and clear, that the way things are can change. We've had this big resetting of the Overton window in politics in recent years and now we're having this big reset of the Overton window in the economy.

This is just the beginning of changes, not the end. A lot of people frame this up as 'What happens post-pandemic?'. I don't think that's the right way to think about it. This is the century of things that we can imagine could happen but never really took seriously and never prepared for actually happening. And that's a big deal.

O'Reilly said that people and commentators always point to the 'digital revolution' of the past decade or so as a period of unprecedented change. However, he believes that this is not accurate and that there have been similarly disruptive developments in recent history, for example the period between 1890 and 1930. In fact, O'Reilly argues that the digital change we have been experiencing, whilst meaningful, was pretty continuous with what went before.

But COVID-19 is different, and because of that, we now have an opportunity to build a collective consensus on how to shape society and the economy going forward. He said:

I think we in the developed world are facing our first serious period of change in a way that we have not seen before, for a very long time. And because everything is up for grabs, I think there is a real opportunity and a requirement to shape that. Instead of just taking whatever we get.

Social, racial and economic equality are front of mind for O'Reilly. So too is the urgency around climate change. O'Reilly spoke about capitalism being the best of the worst economic systems, but argued that we don't have to accept it in its current form and we can use the technology we have available to us to build something better (more on that later). However, it will require a conscious effort to drive the change we want to and ought to see. This will come eventually, he said, but it would be better that it happened sooner rather than later.

This is why I'm excited in a way because it's breaking the old paradigm. I don't think it's entirely going to go away and I don't think it's going to go away easily. I say to people, look we can have a positive 30/40/50 years, or we can have a really negative one before we wake up. We can rise to the occasion and put the machines to work alongside us or we can keep building this fundamentally trivial consumer economy, where we are making stuff that nobody really wants and throws away.

But how do you build this consensus for change? That's not an easy question to answer, particularly in a world where divisions between ideas and fields of thought are growing wider by the day and lines are being drawn left, right and centre. O'Reilly is of the view that we can't expect society as a whole to just understand what 'truth' is within the context of swathes of information being distributed online, via often unknown sources. People are influenced easily and we need to develop tools and educate ourselves on discerning truth from fiction. He said:

We can't just let people go off into these disjunct realities and then hate on each other. I'm not sure how we get back to that, but we are going to have to. I do think that through the power of, in some sense, persuasion - for example, in America Donald Trump persuaded a group of people that a set of feelings were okay to express. And now a group of people associated with Black Lives Matter has persuaded a different group of people to express and to have solidarity. What you see are these vast contests for human belief. These media idea storms are the future.

I think one of the most important technologies that we're going to have to develop, is that you can't rely on people to be media literate. You can't rely on people to sort out truth from falsehood. A lot of people say Facebook's algorithm is the problem - yes, Facebook's algorithm is the problem today, but it's also the solution. I feel very strongly that there has to be more curation, not less.

These are all big ideas and it can sometimes seem difficult to pinpoint exactly what kind of change we should be striving for. One area of particular interest for O'Reilly, unsurprisingly, is intelligent machines, AI and algorithmic systems. He is adamant that the fundamental skill that society has to get better at in the 21st Century is partnering with intelligent machines - instead of driving out human capital to reduce cost, we need to think about how these intelligent systems can be used to reshape the economy (where the driver isn't just share price).

O'Reilly said that he often uses a quote by Dr. Paul Cohen, the founding dean of the School of Computing and Information at the University of Pittsburgh, which is:

The opportunity of AI is to help humans model and manage complex interacting systems.

We need to look at our economies through this lens, O'Reilly argued.

We are engaged in this massive project to rebuild our economy, with all of the signals we have today, rather than the signals that we had 100 years ago.

The algorithm of our financial markets is to maximise corporate profits and stock share prices and humans are a cost to be eliminated. And then you say, well why do we have this society of inequality and inequity? It's because we built a machine and told it to optimise for that. I think where we are right now is that we are at a moment where we can recognise the choices that we've made in building the society we built.

Again, COVID-19 is playing a significant role in driving this shift in thinking forward, given that people are recognising that the way governments find and spend money is down to political choice. For example, the US government arguing that there is no money for universal healthcare, but then finding trillions of dollars to prop up industries and the stock market.

We need to get better at using these new intelligent systems to reshape the economy in a way that works for everyone. O'Reilly said:

The fundamental skill we have to get better at in the 21st Century is partnering with intelligent machines. It's easy to see things like Amazon's next day delivery or Uber and their ilk through the lens of current broken labour markets. But you could look at them through the lens of this massive algorithmic coordination of human effort. We are in the early stages of that.

I found this conversation with O'Reilly fascinating in many ways. Writing this story was a challenge, given that the ideas are so big and almost seem incomprehensible within the current system that we operate. But I think O'Reilly is right, COVID-19 has highlighted that change is possible and we *do* have a choice and we *do* have control over how we shape the economy. Building a consensus over what sort of economy and society we'd like to have isn't an easy thing to do, but I think it's becoming clear to many that the current system we have in place isn't working for a significant chunk of people. We need to focus our efforts on coalescing people around real change that lifts us all, rather than getting distracted with disinformation. And we need to stop assuming that access to economic opportunity isn't gated in many ways, because it is.

I'll finish with the following quote from O'Reilly:

I think there are more COVID-like wildfires in our future. So our ability to respond is going to be super important. There are some really important things about capitalism - in many ways it's the worst economic system, except for all the rest. But that doesn't mean it can't be improved. And part of what makes it better is more perfect knowledge. We have tools for knowledge and coordination that we didn't have before.

We are now building systems at scale that shape what billions of people believe, new kinds of systems. We need to understand how to use those systems effectively. We do need to redirect our economy in some pretty fundamental ways, but I have more hope that we're actually going to be able to do that than I've ever had before. We've seen that it's possible to do it in different ways.


Managing while invisible: how the gig economy shapes us and our cities – Qrius

The gig economy is full of disruptive technological darlings. Uber revolutionised how we use taxis, AirBnB changed the hospitality market forever, while Deliveroo has a substantial impact on how cities develop and change and how we use our city space. Their impact, we argue, is a consequence of one of their most important inventions: how to look like they're invisible. It is by making themselves invisible that they redefine social responsibilities. This is their basic modus operandi (MO), which they put forward and apply again and again, most recently to deny employee rights to their workers. This MO is based on their effortful attempt to act and manage invisibly, which is a political act. We look at Uber for evidence of such invisible management.

We draw these conclusions from our analysis of two UK court cases: one in the High Court of Justice in 2015, and the subsequent major Uber case that took place in the Central London Employment Tribunal in 2016. These cases are interesting because they reveal how the judges have to navigate the law to rule on concepts that weren't thought of when the laws were written. Quite a challenge indeed!

The first court case was a ruling from the High Court of Justice in October 2015. The judge had to consider whether Uber was a taxi service, and hence a transport service, rather than a technology company. The key issue was whether the app could be considered a taximeter or not. What is a taximeter? It is defined as a calculative device that must be for the calculation of the fare. Yes, clients exchange money with drivers to take them from one point to another, and this is displayed on the clients' and drivers' apps. The calculation, however, happens in Uber's servers and not in the apps, so the smartphones are not taximeters, and thus Uber is not a taxi service. Its non-presence in the driver's car allows it to remain a technology company and not a transport service. Uber, then, was just a technological infrastructure that matched people together.

If Uber is not there in the car calculating fares, its presence is felt in other ways, as the second case shows. In the Central London Employment Tribunal in 2016, the judge's ruling centres this time on the changing nature of Uber and its position as an intermediary. Uber presents itself as an invisible infrastructure that connects two people and proposes a fare and travel option. An infrastructure is a great analogy for Uber: you don't think about the roads you walk on when you walk them, their purpose or why they are there. You don't wonder where the water pipes that give you water come from or go to: the water is there and it's as if it's always been there. It's hard to imagine London without its roads.

So when questions arise about whether drivers should be considered employees and what Uber's involvement with them is, the invisible infrastructure is a great analogy, because it rationalises Uber's usefulness without the company being conspicuously involved; even its fare calculations are unseen. As an infrastructure company, Uber is like a road connecting people together. Its involvement is invisible: you don't question the road you walk on when you go to meet someone, do you? However, a series of documents presented to the judge by both the claimants and defendants made the judge unpack the invisible aspect of the infrastructure.

Indeed, Uber imposes upon the drivers the path to take (with ensuing punishment if the drivers fail to take it), monitors the behaviour of drivers (through a rating system), and screens the drivers and their cars at recruitment (black cars are preferred). Much of this conditioning and monitoring happens through and by the algorithm. Invisible, yet organising work, Uber's algorithm was deemed to manage people just as a supervisor would.

The law here is a key player in the definition of Uber itself and of technology. Before the Central London Employment Tribunal's ruling, Uber was a digital platform, exemplary among technology companies as a match-making infrastructure with as much right to be part of our cities as the streets have; an invisible actor connecting people together and drawing up the public space for us. After the ruling, Uber became an infrastructure with responsibilities. These can be listed: Uber made sense of the city, mapped it, decided what cars should roam where, what roads to take, what price to pay. Uber did not only match people together; it also became seen as an agent responsible for defining the roles of the people it connected. Through its driver ratings, for example, Uber would define what a good driver was. The app rating system had an answer: Uber could define the notion of 'driver' from drivers' interactions with the app. Ironically, it is these questions that pushed the two claimants to present their case against Uber: they resisted the app's control over their own understanding of what drivers are, where they should be, and who should judge them.

Uber is an infrastructure different to the roads, the ports, and the pipes in our cities. It is a thinking infrastructure that manages people through our very use of it. It is important, to our minds, to think beyond digital infrastructures cast as platforms without responsibilities, without agency. They make people perform certain roles and act in certain specific ways which may be obscured, obfuscated, or plainly unclear. We have to think about infrastructures not just as a foundation upon which other things are built, but as infrastructures that create relations and create roles. From this perspective, defining infrastructure becomes a political act. Beyond the promise of efficient matchmaking, what sort of society are such platforms trying to configure? Perhaps we should also ask ourselves: what sort of society are we willing to see?

Daniel Curto-Millet is a Marie Curie research fellow at the Spanish National Research Council (CSIC), studying the sustainability of open source beyond technical environments. He is interested in the intersection between organisation, technology, and society. He has conducted research on openness as an organisational principle and on open source software development. Daniel holds a PhD in Information Systems from LSE. Twitter: @curtomil.

Roser Pujadas is a research fellow in information systems at LSE, undertaking research on the organisational, managerial and social implications of digital interfaces, as part of the EPSRC-funded project Interface Reasoning for Interacting Systems (IRIS). She is broadly interested in the social and organisational implications of digital innovation. She has conducted research on the sharing economy, considering the variety of models of economic organisation that digital platforms support, and the ways gig workers navigate and support each other in the sharing economy landscape. Roser holds a PhD in Information Systems (LSE). Twitter: @roserpujadas1.

This article was first published in LSE Business Review



US issues new ‘highly unusual’ indictment against Assange RSF – The Shift News

Reporters Without Borders has criticised the latest move by the US Department of Justice against Julian Assange after it issued a new indictment against the Wikileaks founder that broadens the scope of the hacking allegations he is facing.

Describing it as "highly unusual", the press freedom NGO pointed out that the new indictment, which supersedes the previous ones, widens the scope of the conspiracy claimed in the hacking allegations filed against him.

Assange is facing extradition proceedings in the UK after the US Department of Justice filed 17 counts under the Espionage Act and one charge under the Computer Fraud and Abuse Act (CFAA).

The new indictment did not add new charges but expanded the scope of the CFAA charge and changed the evidential basis of some of the other charges against him, RSF said in a statement.

The NGO, which has been closely monitoring Assange's case since the very start, pointed out that the move was highly unusual at this late stage in the extradition case.

"The superseding indictment is the latest in a long series of moves by the US government to manipulate legal loopholes in their targeting of Julian Assange, to undermine his defence, and to divert public attention from the extremely serious press freedom implications of his case," RSF Director of International Campaigns Rebecca Vincent said.

The timing was also questioned by Assange's lawyer Mark Summers in an administrative hearing in the UK on 29 June, where he expressed surprise that the defence team had learned about the indictment through the press. The new indictment had not yet been sent to Assange's lawyers or the court, or been formally filed in the UK proceedings, the NGO said.

Meanwhile, Assange continues to be held at the high-security Belmarsh prison in ailing health, which even led him to miss a hearing earlier in June, raising further concerns about the state of his health and strengthening calls for him to be released, especially in light of his potential exposure to COVID-19.

Last month, a group of 11 current and former statesmen from Europe and the US wrote an open letter to the Lord Chancellor and Secretary of State for Justice Robert Buckland and UK Commons Justice Committee Chair Bob Neill, asking for Assange to be released into home detention with a 24-hour monitoring ankle bracelet.

The politicians supported an urgent appeal by Australian MPs Andrew Wilkie and George Christensen who asked the British authorities to urgently reconsider granting Assange monitored home detention.

The Wikileaks founder has been charged for publishing the Afghanistan and Iraq war diaries and US embassy cables, important documents that many journalists around the world used. The War Diaries provided evidence that the US government misled the public about activities in Afghanistan and Iraq and committed war crimes.

RSF had previously expressed concern about the US government's lack of evidence for its charges against Assange, and believes Assange has been targeted for his contributions to public interest reporting.
