Venafi flies off with UK-based Kubernetes startup – www.channelweb.co.uk

Cybersecurity vendor Venafi is set to acquire London-based Kubernetes security specialist Jetstack.

Jetstack was founded five years ago and specialises in open source machine identity protection software for Kubernetes and cloud native ecosystems.

The acquisition comes hot on the heels of VMware's acquisition of another Kubernetes security outfit, Octarine.

"In the race to virtualise everything, businesses need faster application innovation and better security; both are mandatory," said Jeff Hudson, Venafi CEO.

"Most people see these requirements as opposing forces, but we don't. We see a massive opportunity for innovation.

"This acquisition brings together two leaders who are already working together to accelerate the development process while simultaneously securing applications against attack, and there's a lot more to do. Our mutual customers are urgently asking for more help to solve this problem because they know that speed wins, as long as you don't crash."

Jetstack's most popular open source software is cert-manager, which allows developers to quickly create, connect and consume certificates with Kubernetes and cloud native tools.
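For readers who want to see what that looks like in practice, here is a minimal sketch of creating a cert-manager Certificate resource from Python with the official Kubernetes client. It assumes cert-manager is already installed in the cluster; the issuer name, DNS name and namespace below are hypothetical placeholders rather than anything taken from the article.

```python
# Minimal sketch: asking cert-manager for a TLS certificate from Python.
# Assumes cert-manager is installed in the cluster; the issuer name, DNS name
# and namespace are hypothetical placeholders.
from kubernetes import client, config

config.load_kube_config()  # use load_incluster_config() when running in a pod
api = client.CustomObjectsApi()

certificate = {
    "apiVersion": "cert-manager.io/v1",
    "kind": "Certificate",
    "metadata": {"name": "example-com-tls", "namespace": "default"},
    "spec": {
        "secretName": "example-com-tls",          # Secret that will hold the key pair
        "dnsNames": ["example.com"],              # hypothetical domain
        "issuerRef": {"name": "letsencrypt-prod", # hypothetical issuer
                      "kind": "ClusterIssuer"},
    },
}

api.create_namespaced_custom_object(
    group="cert-manager.io",
    version="v1",
    namespace="default",
    plural="certificates",
    body=certificate,
)
```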

The two firms have been working together for the past two years on creating next generation machine identity protection in Kubernetes, as well as in the multi-cloud, service mesh and microservices ecosystems.

Matt Bates, CTO and co-founder of Jetstack, said: "We developed cert-manager to make it easy for developers to scale Kubernetes with consistent, secure and declared-as-code machine identity protection.

"The project has been a huge hit with the community and has been adopted far beyond our expectations. Our team is thrilled to join Venafi so we can accelerate our plans to bring machine identity protection to the cloud native stack, grow the community and contribute to a wider range of projects across the ecosystem."

Read more:
Venafi flies off with UK-based Kubernetes startup - http://www.channelweb.co.uk

The Pandemic Is Turbo-charging Government Innovation: Will It Stick? – Knowledge@Wharton

As the U.S. government steps in to address the fallout from the COVID-19 pandemic, solutions can and should be designed that realize long-term benefits and capture the full potential of technology innovation in the years to come, write the authors of this opinion piece, John Goodman, chief executive officer of Accenture Federal Services, and Ira Entis, managing director of growth and strategy at Accenture Federal Services. The trick for making these solutions stick, they say, is transitioning from a focus on modernization to a culture of continuous innovation.

The world's getting more crowded, more interconnected, and more complicated. Not surprisingly, the challenges the U.S. confronts today are increasingly complex, fast-shifting, and harder, if not impossible, to tackle with traditional tools and methods. Just consider persistent challenges like cybersecurity, healthcare, and the one we are all most concerned about right now: the COVID-19 pandemic and its many troubling repercussions. As government steps in to address these issues, solutions can and should be designed that realize long-term benefits and capture the full potential of technology innovation in the years to come.

Challenges like these really bring into focus the many critical roles that government agencies play in our lives. The current environment underscores how important it is that our government operate with the latest innovations and capabilities in hand. Government leaders need data-derived insights at their disposal and advanced technology tools that allow them to move rapidly and collaboratively as their mission-focused workloads require.

For policy makers, today's COVID-19 crisis is heightening the urgency for government modernization: The massive $2 trillion coronavirus aid package passed by Congress in March includes $340 billion in new government appropriations, much of which will go toward government telework, telehealth, cybersecurity, and network bandwidth initiatives.

Just as many companies are employing artificial intelligence, machine learning, and advanced data analytics to rapidly model potential COVID-19 vaccines and anticipate where the pandemic is likely to spread, so too is government turning to innovation for assistance. To cite just a few examples:

As government increasingly adopts and applies innovative technologies, whether in response to specific crises or to address other mission needs, we all reap benefits down the road, many of which are often unforeseen at first. Innovations that are ubiquitous today, including the Internet, GPS, touchscreen displays, smartphones, and voice-activated personal assistants, all stem from government investments.

Yet this impressive legacy of government-driven innovation often gets overshadowed by the outdated technology and archaic business processes that are still embedded across the government. Many federal agencies still rely on legacy IT systems and business processes, some of which are decades old, to run their day-to-day operations and businesses critical to national security and quality of life.


Until recently, government agencies have been relatively slow in adopting the emerging technologies and commercial best practices (cloud computing, artificial intelligence, robotic process automation (RPA), human-centered design, and customer experience, to name a few) that have been powering positive disruptions in the commercial sector for years.

In the government's defense, there have been legitimate reasons for that delay. Federal agencies have been slowed in their efforts to embrace modern technologies and practices by archaic procurement systems. They have tried to adapt commercial off-the-shelf (COTS) software to rigid, complex, decades-old internal business processes that are often rooted in law and shaped by highly prescriptive compliance regulations. Given the highly sensitive nature of their data and missions, they have legitimate privacy and security concerns to sort out with many commercial technologies. The changing nature of work has also added workforce reskilling costs when onboarding new technologies and practices. And then consider the scale and complexity of many government operations and missions, which would make the application of new technologies challenging under any circumstance.

It turns out that the government's delays in technology adoption over the last two decades may have worked in its favor. Commercial tech companies during that time have steadily improved the security, portability, scale, ease of use, and interoperability of their offerings. Emerging capabilities such as software-defined everything, virtualization, containerization, open source software, API connectivity, advanced encryption, advanced data visualization, robotic process automation (RPA), and machine learning have evolved in recent years to the point where commercial technologies today are far more adaptable to government needs and use cases.

And, to their credit, federal agencies throughout that period have been busy overhauling their outdated bureaucratic approaches, refreshing their modernization goals, and overcoming many of the tech adoption hurdles they previously faced. Federal leaders have persistently and effectively pressed for reforms, experimentation, and workarounds to smooth out the many points of friction that have slowed modernization progress.

Government Transformation Is Already Under Way

Consequently, government today is truly poised to achieve transformational change. In fact, we are already seeing progress toward this.

Most importantly, federal agencies are turning the corner in their adoption of commercial cloud services. Cloud computing has emerged as the centerpiece of federal IT modernization efforts, and agencies now fully recognize the power of the cloud to help them improve and expand their mission capabilities, increase agility and responsiveness, contain costs, and enhance security. In short, federal agencies correctly view cloud adoption as pivotal in their ability to leap-frog from being technology laggards to technology leaders.

As a result, we are seeing agencies make unprecedented investments in cloud services: Federal cloud spending, which totaled $2.4 billion in 2014, has steadily accelerated to a projected $7.1 billion for this fiscal year, according to Bloomberg Government. Agencies, such as the departments of Health and Human Services, Veterans Affairs, and the Air Force, are leading the charge on cloud, while others, such as the departments of Education, Treasury, State, and Agriculture, are embarking on enterprise-wide IT modernizations today.

Importantly, these cloud investments are now unlocking billions of dollars in spending on cloud-enabled digital services, artificial intelligence (AI), and other capabilities that can help agencies translate their vast, accumulating stores of data into mission-advancing insights and operational efficiencies. Civilian and defense agencies across government are steadily expanding their annual investments in AI and web and mobile-based government services from roughly $4 billion in 2016 to a conservative estimate of more than $6.6 billion projected for this fiscal year.


With cloud capabilities in place, agencies are positioned to use emerging technologies to tackle case management backlogs, improve citizen services, and deliver more holistic responses to today's emerging and complex challenges, such as multi-domain operations in defense, public health, cybersecurity, and the transitioning economy.

Government agencies are also beginning to apply private sector concepts like human-centered design, customer experience, and behavioral science in their service delivery. For example, more than two dozen agencies ranging from the IRS and Transportation Security Administration (TSA) to the Office of Federal Student Aid are designated as high-impact service providers, which means they must measure and improve the impact of the customer experiences they deliver to citizens. Likewise, agencies this year plan to accelerate their adoption of robotic process automation (RPA), or bots, to automate and modernize repeatable business processes so they can transition employees to higher-value work.

We are also now seeing steady rollouts of modernized citizen-facing digital experiences. For example, the U.S. Patent and Trademark Office is developing AI-enabled research tools to help its staff process patent and trademark applications more efficiently. Likewise, Health and Human Services plans to employ AI, intelligent automation, blockchain, micro-services, machine learning, natural language processing, and RPA to support services like medication adherence, decentralized clinical trials, evidence management, outcomes-based pricing, and pharmaceutical supply chain visibility.

These rollouts represent meaningful advances in the way government is stepping up to the challenge of modernization. But perhaps the hardest challenge lies ahead, which is this: Government agencies will need to think about and metabolize innovation in fundamentally different ways going forward. Cloud, AI, and automation cannot be thought of as a one and done modernization project or even as a series of projects. Technologies will constantly advance, so agencies need a different mindset that views innovation as a non-stop journey of continuous evolution and adaptation. This means government agencies need to re-orient their strategic planning, budgeting, and cultures to think and plan in those terms.

Other new challenges will emerge as well. For example, the government needs to address how to maximize the use of its data and ensure that the data-driven capabilities it deploys are ethical, unbiased, and intuitive to use. In fact, we already see new tools in law enforcement, housing, commercial banking, and human resources that address the ethical use of AI. The Defense Department has also issued its own draft set of principles and recommendations on ethics and AI for department leaders to consider.

The Next Chapter in Government Innovation

So, what comes next in the government's innovation arc? As emerging technologies become more normalized within government environments, we expect to see agency leaders fundamentally shift their thinking about the role of technology in their agencies' mission success, just as we have seen occur in the private sector.

To illustrate this point, think of the most innovation-driven companies. Most, if not all, of them fundamentally view themselves as technology companies first, regardless of what business they are in. Amazon.com, for example, views itself as an information technology company that employs digital capabilities to produce a tremendous user experience and a highly synchronized logistics operation to support a large-scale retail business. Numerous other companies, including Federal Express, Disney and Capital One, to name just a few, invested in more robust digital capabilities so they could reinvent their own businesses and service-delivery models for a new era.

Government needs to do this as well. But it can only happen when policy makers and agency leaders begin to view IT as a foundational investment in their future mission performance rather than a line-item business expense. Once this happens, the shift in mindset will set off a chain of new thinking among government employees and the public alike. As government begins experimenting and seeing successes in inventing new approaches to deliver services, conduct business, and meet its missions, we will see the culture shift as well.


For example, as modernization efforts advance, technology will remove much of the repetitive, manual work now done by government employees, freeing them up to take on higher-impact work where they will be more empowered, with access to data and modern tools, so they can contribute more independently and directly to their agencies' performance. However, to sustain this, government workplaces will need to build cultures that prioritize continuous, dynamic reskilling and training to keep pace with the velocity of change.

If managed well, we should see government begin to prioritize continuous innovation and use private sector best practices to produce a steady stream of future-focused pilot projects that can be iterated further and eventually scaled to wider applications. This can involve creating sandboxes and field labs that run innovation trials and pilot new technologies and approaches.

These efforts will lead to AI and data analytics becoming embedded in day-to-day enterprise operations and applied to massive amounts of structured and unstructured data. This will enable federal agencies to dramatically reduce the latency of information, produce greater insights, and better inform their policy and decision-making.

Eventually, government will move more aggressively beyond project-focused innovation efforts and transition to more scalable delivery frameworks and ecosystems. This means adopting open, standards-based technology platforms that enable agencies to be more future-ready, flexible, and capable of scaling innovation to far larger use cases. This is especially critical in an age where government agencies and the mission solutions they employ operate increasingly within a broader ecosystem of other government agencies, nonprofits, academia, start-ups, digital-savvy companies, and crowd-sourcing platforms.

Regardless of where the government's modernization journey leads from here, the important point is that the federal government is now poised to embrace the emerging technologies and innovative thinking driving much of today's advances and make dramatic leap-frog advances as a result.

Continue reading here:
The Pandemic Is Turbo-charging Government Innovation: Will It Stick? - Knowledge@Wharton

31 commercial open-source software startups that will thrive in 2020 – Business Insider

The economic downturn sparked by the coronavirus pandemic may be hitting many industries hard, but investors say that it may turn out to be a positive opportunity for commercial open source startups.

These types of startups build and maintain open source software that is free for anyone to use, download, or modify. While the software itself is free, these kinds of startups traditionally build a business on top of it by selling customer support or custom-built services for the enterprise.

Investors say they're still betting on open source even as the rest of the tech industry encounters turbulence, because several of the best tech companies, like Airbnb and Dropbox, have historically been founded during economic downturns, including many, like Canonical, that build popular open source software.

Indeed, OSS Capital founder and CEO Joseph Jacks says that he expects adoption of open source software to accelerate during this period. It's often cheaper than proprietary software from the likes of Oracle, SAP, or Microsoft, for starters, which makes it an appealing alternative for companies cutting their IT budgets.

"Open source is much cheaper than proprietary software," Eric Anderson, Scale Venture Partners, told Business Insider. "Enterprises will look at alternatives to some of the proprietary software they're paying for."

There are some structural advantages for these startups, too. Open source software is generally developed by distributed teams of volunteers all over the world. This means that the startups that emerge to support open source software are usually similarly distributed, and more ready to work remotely. That's an edge in a time of global stay-at-home orders, Jacks says.

"Those companies, in fact, are going to do really well in a society and culture and environment where everyone is forced to be as efficient and as productive as possible using the internet and remote tools," he said.

Jacks says the current situation has only reinforced his belief in investing in these types of companies because of how the world has had to adapt.

"It's definitely not changed anything about the way that we get excited in enterprise," he said. "It only made things more concrete."

What follows is a compelling list of commercial open-source software startups set for success in 2020, according to 12 VCs. They range from fledgling companies with hardly any funding to well-established ones that could very well be heading for a huge exit. Funding and valuation information are estimates taken from Pitchbook, the keeper of such data.

See the rest here:
31 commercial open-source software startups that will thrive in 2020 - Business Insider

Linux Foundation will host the Trust over IP Foundation – TechRepublic

New open source standards are coming that can help technologies such as edge computing and IoT achieve greater security.


On May 5, 2020, the Linux Foundation plans to announce that it will be hosting the Trust over IP Foundation (ToIP), a project with one goal: enable the trustworthy exchange and verification of data between any two parties on the internet.

This is long overdue.

Effectively, what the ToIP Foundation will do is provide a common standard to ensure data is coming from a trusted source. This standard is being developed with global pan-industry support. Founding members include Cloudocracy, Continuum Loop, CULedger, esatus, Evernym, The Human Colossus Foundation, IBM Security, IdRamp, kiva.org, Lumedic, Mastercard, MITRE, Province of British Columbia, and SICPA. Contributing members include DIDx, GLEIF, iRespond, Marist College, Northern Block, R3, Secours.io, TNO, and the University of Arkansas.

SEE: How smart tech is transforming the transportation industry (TechRepublic Premium)

Technology has evolved to such a state that the transmission of data comes in many forms and from many sources. No longer is user data only transmitted via the traditional network connection, from client to server or client to client. The new world order includes IoT, hybrid clouds, artificial intelligence, and edge computing. The complications inherent in these technologies make it even more crucial that universal security and privacy protocols are developed and put into place.

Given that technology already faces a seemingly insurmountable security hurdle, the idea of a universal standard will be a welcome change for both the industries driving these technologies and the consumers whose data is being used and monetized. At the moment, consumer confidence in data transmission security is at an all-time low. This trend will continue until drastic change happens.

Dan Gisolfi, CTO of Decentralized Identity at IBM Security, understands how important these new protocols are. "In today's digital economy, businesses and consumers need a way to be certain that data being exchanged has been sent by the rightful owner and that it will be accepted as truth by the intended recipient. Many privacy focused innovations are now being developed to solve this challenge, but there is no 'recipe book' for the exchange of trusted data across multiple vendor solutions. The new Trust over IP Foundation marks an evolutionary step which goes beyond standards, specs, and code, with the goal of creating a community-driven playbook for establishing 'ecosystems of trust.' IBM believes that the next wave of innovation in identity access management will be for credential issuers and verifiers to partake in these ecosystems, where trusted relationships are built upon cryptographic proofs."

Although there isn't too much information on the nuts and bolts of the new universal standards, it has been reported that the ToIP Foundation will use digital identity models that make use of digital wallets and credentials, as well as the W3C Verifiable Credentials standard, to address these challenges. By working with these technologies, the ToIP Foundation will enable consumers, businesses, and governments to better manage risk, improve digital trust, and protect all forms of online identity.
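The article stays at the level of standards names, but a W3C Verifiable Credential is, roughly, a signed JSON document. The sketch below shows its general shape as a Python dict; the issuer, subject and credential contents are hypothetical placeholders, and a real credential would carry a genuine cryptographic proof rather than the stub shown here.

```python
# Rough sketch of the shape of a W3C Verifiable Credential, expressed as a
# Python dict. Field names follow the public VC data model; all identifiers
# and values below are hypothetical placeholders.
import json

verifiable_credential = {
    "@context": ["https://www.w3.org/2018/credentials/v1"],
    "type": ["VerifiableCredential"],
    "issuer": "did:example:issuer-123",              # hypothetical issuer DID
    "issuanceDate": "2020-05-07T00:00:00Z",
    "credentialSubject": {
        "id": "did:example:subject-456",             # hypothetical subject DID
        "alumniOf": "Example University",
    },
    "proof": {
        "type": "Ed25519Signature2018",              # one commonly used signature suite
        "created": "2020-05-07T00:00:00Z",
        "verificationMethod": "did:example:issuer-123#key-1",
        "jws": "<detached-signature-omitted>",       # stub, not a real signature
    },
}

print(json.dumps(verifiable_credential, indent=2))
```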

To this, Jim Zemlin, executive director at the Linux Foundation, said: "The ToIP Foundation has the promise to provide the digital trust layer that was missing in the original design of the internet and to trigger a new era of human possibility. The combination of open standards and protocols, pan-industry collaboration and our neutral governance structure will support this new category of digital identity and verifiable data exchange."

The devices that transmit such data currently exist without a standard. Because a vast majority of those devices transmitting these data packets are powered by open source solutions, it only makes sense that any standard to be developed for such technology would be open.

Why? That's a fairly simple question to answer. First off, the technologies that will be using these protocols are almost all founded on open source software. Should those protocols be closed, developers would have a hard time implementing them. The second reason is that such protocols need to be available for global vetting. With thousands (or hundreds of thousands) of developers poring over the code, those protocols will wind up more robust and secure.

Finally, making these protocols open source guarantees that no single entity can control how the protocols are developed or used. This should go a long way to prevent any one large company, with a vested interest in a singular bottom line, from preventing smaller companies from using the protocols with their software.

The ToIP Foundation will initially host four working groups:

Technical Stack Working Group

Governance Stack Working Group

Utility Foundry Working Group

Ecosystem Foundry Working Group

The Technical Stack Working Group and the Governance Stack Working Group are charged with building out and hardening the technical and governance sides of the stack, while the Utility Foundry Working Group and Ecosystem Foundry Working Group will serve as communities of practice for any project looking to collaborate with any part of the ToIP ecosystem.

The ToIP Foundation will host a digital launch event at 9AM PT on May 7, 2020. This event will feature a panel discussion, interoperability demonstration, and a Q&A. To register for the event, visit the official Zoom registration page. A second event will be held for the APAC region.


Read more:
Linux Foundation will host the Trust over IP Foundation - TechRepublic

American tech goliaths decide innovation is the answer to Chinese 5G dominance, not bans, national security theater – The Register

Some of America's super-corps have remembered how the US became the dominant global technology force it is, and have vowed to use innovation over threats to counter Chinese dominance in 5G markets.

Thirty-one corporations ranging from AT&T to VMware, and including Facebook, Google, IBM, Intel, Microsoft and Oracle, have launched what they are calling the Open RAN Policy Coalition, with a goal to develop new, open, and interoperable approaches to the 5G network.

RAN stands for Radio Access Network, the technology that connects individual devices like smartphones to the rest of the network. It comprises three main components: the radio unit (RU), the distributed unit (DU) and the centralized unit (CU). Each part is based in a different location and typically uses different hardware and software. Despite the enormous global use of such technology, however, most of it remains proprietary, developed and controlled by individual companies.

The Open RAN Policy Coalition proposes opening this entire market up to make all aspects interoperable: a move that it hopes will allow new companies to enter the market and shake things up before next-generation 5G becomes ubiquitous.

While not all companies in the coalition are American (it also includes Fujitsu, Telefónica and Vodafone, for example), it is US-heavy, and the implicit drive is that the organization will help prevent Chinese companies from taking over the 5G market.

That dominance has created global political repercussions, with the US government going to extraordinary lengths to limit the take-up of equipment from Chinese companies like Huawei. The Trump administration has banned Huawei, and others, from the US citing national security concerns. It has also exerted significant pressure on European partners, insisting that they also reject Chinese companies in their 5G plans and even threatening to withdraw from intelligence-sharing agreements if they don't. Those threats have largely fallen on deaf ears, with both Germany and the UK rejecting the demand.

But behind the threats has also sat an uncomfortable truth: Chinese companies dominate the market in large part because there aren't many good alternatives; competing products are limited, more expensive and often inferior. And that lack of competition is in large part down to how the market has traditionally operated, with proprietary equipment creating massive barriers to entry.

The reality is that using cell sites from different companies leads to a drop-off in performance and so network operators tend to go with a single company for a specific geographic area. An interoperable approach where there is no drop-off could radically change how the new 5G networks are put together. As such, the Open RAN project hopes to create innovation, spur competition and expand the supply chain for advanced wireless technologies including 5G.

The Open RAN Policy Coalition is not the only group hoping to open up the market. There is also the Open Radio Access Network Alliance (O-RAN), which was started in 2018 and includes over 100 companies working on interoperable interfaces. There is the Telecom Infra Project, headed by Vodafone, which has agreed to work with O-RAN. And there is also the Linux Foundation's Open Network Automation Platform (ONAP).

This new coalition's role will be to push for solutions that are hardware-neutral and produce open source software standards. And one big focus for it will be to get the US government to focus less on bashing China and Chinese companies and more on the traditional approach of producing the best technology.

"The coalition believes that the US Federal Government has an important role to play in facilitating and fostering an open, diverse and secure supply chain for advanced wireless technologies, including 5G, such as by funding research and development, and testing open and interoperable networks and solutions, and incentivizing supply chain diversity," it said in a statement.

Its executive director also had this to say: "As evidenced by the current global pandemic, vendor choice and flexibility in next-generation network deployments are necessary from a security and performance standpoint. By promoting policies that standardize and develop open interfaces, we can ensure interoperability and security across different players and potentially lower the barrier to entry for new innovators."


Read the original post:
American tech goliaths decide innovation is the answer to Chinese 5G dominance, not bans, national security theater - The Register

Looking for a new IT gig? Here are vacancies around the world for developers, cloud engineers, infosec analysts, Jira admin, and more – The Register

Job Alert This week we've got job openings from all over the globe to tempt you, your friends or your past colleagues back into work, or indeed into new ventures.

Remember, we're doing this to help the world keep working in these tough times. Our job listings are free - free in the financial sense, free in the snark sense. If you've got roles you need filled, send them here.

First up, we've got a rack of jobs at GBG Plc, for positions it wants filled across the world. GBG says it stands as the global technology specialist in fraud, location and identity data intelligence. And it has numerous vacancies to fill:

Next up, we have a role from Linaro in Cambridgeshire. Linaro has driven open source software development on Arm since its foundation in 2010, providing the tools, Linux kernel quality and security needed for a solid foundation to innovate on, apparently.

Next up we've got three roles from Bianor Services, a company with a more than 20-year history of building custom software solutions, delivering IT business consultancy, Quality Assurance, and support for the likes of IBM, Duracell, P&G, NATO, and the EU. Bianor has offices in New York, USA, and Sofia, Bulgaria, but offers remote work as well. Here are the jobs...

Our final role for this week is from Consult Red, which claims to be a trusted partner to the digital media industry and beyond, driving innovation and delivering support through the entire product development journey. It says it applies cutting-edge experience in product development, hardware, embedded and cloud technology to help companies in all sectors to deliver connected devices and systems. Here's the job...

That concludes another week of our free job listings. Good luck, stay safe, see you next week.

Go here to read the rest:
Looking for a new IT gig? Here are vacancies around the world for developers, cloud engineers, infosec analysts, Jira admin, and more - The Register

Red Hat’s current and former CEOs in conversation about open source – ITWeb

Paul Cormier, CEO of Red Hat, and Jim Whitehurst, president of IBM.

The Red Hat Summit was supposed to be held in San Francisco this year, but instead took place online.

The open source company's in-person events are grand affairs - the conference centre seems to be stained red as thousands of customers, partners, staff and the media flock to what must be the largest gathering of open source enthusiasts on the planet.

One benefit of holding the conference online is that more people can attend; 38 000 signed up for Red Hat Summit 2020 Virtual Experience, which took place from 28 to 29 April.

It fell to the company's new CEO, Paul Cormier, to kick the proceedings off from his home office near Boston, Massachusetts, this week. First, he spoke to Jim Whitehurst, the former CEO, who is now president of IBM and responsible for that company's cloud and cognitive software organisation. He also oversaw the acquisition of Red Hat by IBM for $34 billion last year.

Whitehurst joined the company in 2007, and Cormier asked him what he thought the biggest change had been over the last 12 years. Without a doubt it had been the rise of open source software, Whitehurst said.

"Back then, we were trying to convince people that open source could be a viable alternative to traditional software, so that Linux could replace Unix. I got a lot of questions around whether it's secure and reliable. If you look at where we've come from, it's the default choice now. That change has been amazing."

Red Hat had fewer than 2 000 people when he joined, and now employs over 15 000. Now, he said, almost every large enterprise isn't just using open source technologies in its stack; it's also thinking about how to implement the open source way, such as Agile or DevOps methodologies.

"(They're using open source) not just in how they develop software, but how they run their businesses. People have learned that if they're going to innovate, it requires a different way of working."

Whitehurst said the biggest topic on IBM customers' minds is innovation.

"How do you innovate? We went from a world where value creation was much more about how you executed or made things of higher quality, or cheaper. And now value creation is more about growth - how you invent new things, or how you create whole new markets that didn't exist before. How do you invent new ways to interact with your customers?"

As for the future of Red Hat within IBM, and whether it could continue to operate with any degree of autonomy, Whitehurst said the two companies had shared a vision of hybrid cloud long before Red Hat was acquired. While the larger company can help to scale the adoption of hybrid cloud, it also competes with partners in the open source ecosystem.

Red Hat has to stand alone so that it can work with competitors of IBM to ensure that the platform is neutral and available to anyone.

"It's critical to ensure that the industry has a horizontal platform, and IBM supports that but recognises we have to leave Red Hat separate to ensure its success."

The summits are a chance for Red Hatters, as they're known, to dress up and show their allegiance to the company. Many wear red fedoras, a nod to the company's Linux distribution project. Others wear crimson dresses, or red shoes. Whitehurst usually wears a stylish shirt of the palest pink to the summits, but this year, in a perhaps symbolic move, he chose a shirt in a shade of blue not too far from that of the logo of his new company. He did, however, include a red fedora within eyeshot of his camera's lens.

Read this article:
Red Hat's current and former CEOs in conversation about open source - ITWeb

Jesse Kline on COVID-19: Keeping government secure and saving taxpayer money with open source – National Post

In this era of social distancing, many have turned to videoconferencing as a means of staying in touch with friends, family and colleagues. And although it dragged its feet for quite some time, the House of Commons has now gone virtual, as well. But why is Parliament relying on a foreign company that's selling a piece of software with a raft of known security issues, instead of finding a made-in-Canada solution that would allow us to protect our data and save taxpayer money?

On Tuesday, the full House convened for the first time over Zoom, the videoconferencing software that has become a household name during this pandemic, with its user base exploding from 10 million daily users in December, to 300 million today. Zoom, however, has come under increased scrutiny about its substandard security and lax privacy controls.

The company outright lied about using end-to-end encryption. We learned that it has access to decryption keys, meaning it can potentially snoop on conversations. A team from the University of Toronto found that the software was sometimes sending encryption keys through servers located in communist China, even if none of the participants in the call were from that country. And the term Zoombombing has entered the lexicon, with many meetings being spied on or actively disrupted by people spouting racism and displaying Nazi imagery.

A parliamentary spokesperson told CBC that the version of the software being used by the House has added security features and that most parliamentary proceedings are open to the public anyway, so privacy is less of an issue (cabinet meetings are being held using something else entirely).

Fair enough. But given that the FBI has warned teachers not to use Zoom, and many companies (such as Daimler, Ericsson, SpaceX and Postmedia) and governments (including Germany, Taiwan and Singapore) have banned its use outright, it seems like Parliament should have had some reservations about it.

Much has been made in recent weeks about future-proofing Canada to withstand future crises by producing more supplies here at home. As I've written previously, this is problematic because protectionism doesn't ensure we have adequate supplies of a given product, and it's impossible to predict exactly what we will need to meet the next emergency.

When it comes to software, however, it's a different matter entirely, because there is a huge variety of free and open source software packages available that are already powering much of the world's critical infrastructure and can easily be adapted to Canada's needs.

For the uninitiated, open source refers to software that is developed in the open and given away for free. It is often written by teams that can include many people, from unpaid volunteers to employees of some of the world's largest tech firms. Even if you've never heard of open source, chances are that you are running it, or using technology that is based on it.

A majority of websites run on open source. The open source Linux operating system is the basis for Google's Android and Chrome OS systems, and powers a plethora of Internet of Things devices, from routers, to smart TVs, to home automation systems.

Another videoconferencing platform that's seen a sharp increase in popularity is Jitsi. While it's run by a company called 8x8, which offers free and paid plans, it's also open source, meaning anyone can run a Jitsi server and anyone with enough knowledge can audit its source code to figure out exactly how it works and whether there are any potential security vulnerabilities.

The advantage of the government selecting open systems, like Jitsi, instead of proprietary ones, like Zoom, is that it would allow government to run all its systems in-house, instead of relying on foreign companies to transmit and store data.

It would also give government the ability to conduct security audits of its systems, which is much easier to do when you can see the code that a software package was built with, rather than trying to figure out how a black box works without being able to open it up.

And while there would be an initial cost to purchasing the necessary hardware and ensuring the government has the proper expertise to implement and maintain it, there would be significant savings for taxpayers in the long run, as the government would be able to stop paying for costly software licenses.

Jitsi is already being used by companies like WeSchool, an Italian firm that runs online classroom software that is being used by 500,000 educators and students during this crisis. And in February, the South Korean government began switching its desktops from Windows 7 to Linux, which it expects will save it significant sums of money in the future.

Security researchers have warned the government that Zoom is a privacy disaster waiting to happen. In order to protect our critical information technology infrastructure, especially that which is tasked with running our democratic institutions, from foreign interference and espionage, we need to seriously look at running these systems in Canada, with software we can trust.

Finding open source solutions is the best way to go about doing that.

National Post
jkline@nationalpost.com
Twitter.com/accessd

See the article here:
Jesse Kline on COVID-19: Keeping government secure and saving taxpayer money with open source - National Post

Private Internet Access announces third year of WireGuard sponsorship – Privacy News Online

Private Internet Access is happy to announce that we are sponsoring the WireGuard project as a bronze company donor. Private Internet Access first sponsored WireGuard in 2018, and with our 2020 sponsorship has now been a WireGuard sponsor for three years. Private Internet Access believes in sponsorship as a way of giving back to the community and is proud to sponsor WireGuard.

This year's WireGuard sponsorship is special because it coincides with our release of WireGuard support on all desktop clients and mobile applications. Private Internet Access first released WireGuard support as part of a beta in March 2020 and brought it to all users in April 2020. Similarly, CyberGhost and ZenMate, other VPN companies under the same parent company, KAPE, have also supported WireGuard as bronze company donors.

Private Internet Access is committed to providing a secure and private no-logging VPN service to our customers, and as part of that commitment, we sponsor open source projects and organizations that champion security, privacy, and civil liberties. WireGuard is a new VPN protocol that has been widely acclaimed, so much so that it is now part of the Linux kernel. Learn more about WireGuard in the PIA WireGuide.

Because it is a free and open source software (FOSS) project, WireGuard development is supported by developers that donate their time, as well as companies that donate funds. FOSS underlies much of the internet and computing technologies that we use today and Private Internet Access will continue to support FOSS projects like WireGuard because that is part of the PIA ethos. Without the support of the entire community, projects like WireGuard would not be able to exist to advance our internet. View a full list of organizations and projects sponsored by Private Internet Access on this page.

WireGuard is a registered trademark of Jason A. Donenfeld.

Read the rest here:
Private Internet Access announces third year of WireGuard sponsorship - Privacy News Online

IOTech: Bridging the OT-IT Divide – EE Journal

I've just been talking to the folks from a jolly interesting company called IOTech. One of the things they told me that really struck a chord was that their technology bridges the OT-IT divide. But what is the OT-IT divide, and why does it need to be bridged? I hear you cry. Well, I'm glad you asked, because I feel the urge to expound, explicate, and elucidate (don't worry; I'm a professional).

As an aside, IOTech started in Newcastle upon Tyne, which is a university city on the River Tyne in northeast England (IOTech is now headquartered in Edinburgh, with sales and marketing throughout Europe and America). With its twin city, Gateshead, Newcastle used to be a major shipbuilding and manufacturing hub during the Industrial Revolution; it's now transmogrified itself into a center of business, arts, and sciences. The reason I mention this here is that I almost went to university in Newcastle: I went up there for the interviews and thought it was a fantastic institution in a gorgeous city, but I ended up getting a better offer from Sheffield Hallam University (that is, Sheffield said they'd accept me. LOL).

As another aside, the term Geordie is both a nickname for a person from the Tyneside area of North East England and the dialect used by its inhabitants. All the Geordies I've ever met have had a wonderful sense of humor, and the Geordie dialect has a sing-song quality that's very pleasing to the ear, but I fear we are in danger of wandering off into the weeds...

Now, this next part isn't an aside (I know you're surprised), but rather it is laying a foundation for what is to come. Have you noticed how everybody seems to be talking about the edge at the moment? It seems you can barely start to read an IoT-centric article without hearing tell of things like edge analytics, edge computing, and edge security. Another area of confusion is when people talk about the Internet of Things (IoT) and the Industrial IoT (IIoT). What exactly do we mean by these terms, and how do they relate to the edge?

The problem is that everyone has different understandings as to what the term edge actually means. In fact, we discussed a lot of this in my column What the FAQ is the Edge vs. the Far Edge? As we noted in that column, part of this depends on who you are and your function in the world. If you work for a cloud service provider, for example, then the people who connect to and use your services may live thousands of miles away, so you might regard almost anything outside of your facility, including internet service providers (ISPs), as being the edge.

By comparison, if you are an ISP, then you will regard your customers in the form of homes, stores, factories, etc. as being the edge. Even when you get to a factory, for example, different people will have different views as to what constitutes the edge.

The term edge means different things to different people (Image source: Max Maxfield)

At some stage, we may want to talk about the IoT and IIoT devices that are located at the very edge of the internet, the ones containing the sensors and actuators that interface to the real world. In this case, some people use terms like the far edge and the extreme edge as an aid to visualizing the relative location of these devices in the scheme of things.

But wait, there's more (I can barely believe it myself). The term information technology (IT) refers to the use of computers and networks to store, retrieve, transmit, and manipulate data or information. In the case of an industrial environment like a factory, all of this is overseen by the company's IT department. I know this is unfair, but I can't help but visualize these folks as relaxing in a luxurious air-conditioned common room sipping cups of designer coffee while occasionally deigning to field questions from their users by superciliously saying things like, "Are you sure you've turned it on? You have? Well, in that case turn it off, then turn it on again, and then call back if it's still not working!" (No, of course I'm not bitter; why do you ask?)

But what about the heroes working in the trenches, those charged with the monitoring and control of physical devices, such as motors, generators, valves, and pumps? These brave guys and gals are the cream of the operational technology (OT) department.

All of which leads us to the terms thin (OT) edge and thick (IT) edge. The idea here is that the thin edge refers to the domain of the OT group working with automation-centric data, while the thick edge refers to the realm of the IT group, who want to take that data and run with it, but who cannot access or use it in its raw form. Thus, one of the primary roles of the OT group is to convert the raw data into a form that can be consumed by the IT department. (I'm reminded of the phrase the thin blue line, which refers figuratively to the position of police in society as the force that holds back chaos. Similarly, we can think of the OT folks at the thin edge as holding back the chaos of the real world while transforming it into something the rest of us can use.)

The problem is that, assuming they deign to talk to each other at all, the OT and IT teams speak different languages. What is required is some way to bridge the divide between these groups, which brings us back to the folks at IOTech, whose technology does just that (it's like déjà vu all over again; hang on, didn't somebody just say that?).

Let's start with EdgeX Foundry, which is a vendor-neutral open-source platform hosted by the Linux Foundation that provides a common framework for IIoT edge computing. The goal of the EdgeX Foundry is the simplification and standardization of edge computing architectures applicable in IIoT scenarios.

In this context, the term South Side refers to a heterogeneous set of devices, sensors, actuators, and other IoT objects that produce the raw data. Contrariwise, the term North Side refers to the fog and/or the cloud where the data will eventually be aggregated, stored, and analyzed. The role of the EdgeX Foundry is to take the raw data from the South Side (the domain of the OT group) and treat and process it into a form suitable to be handed over to the North Side (the realm of the IT department).

The term microservices refers to a software development technique, a variant of the service-oriented architecture structural style, that arranges an application as a collection of loosely coupled services. In a microservices architecture, services are fine-grained, and the protocols are lightweight. The reason I mention this here is that the EdgeX Foundry platform is structured in different layers, each one composed of multiple microservices. This modular architecture allows users to easily scale, update, and distribute the logic into different systems, while also improving maintainability.
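To make the South Side/North Side flow a little more concrete, here is a hedged sketch of a device service handing a sensor reading to an EdgeX-style core-data microservice over HTTP. The host, port, endpoint path and payload shape are assumptions chosen for illustration; they follow the general pattern of the EdgeX REST API, but check them against the documentation of whichever version you deploy.

```python
# Hedged sketch: a "South Side" device service pushing one reading to an
# EdgeX-style core-data microservice over HTTP. The URL, port and payload
# shape are assumptions for illustration, not a guaranteed API contract.
import requests

CORE_DATA_URL = "http://localhost:48080/api/v1/event"  # assumed core-data endpoint

event = {
    "device": "boiler-temp-sensor-01",   # hypothetical device name
    "readings": [
        {"name": "temperature", "value": "87.4"},
    ],
}

resp = requests.post(CORE_DATA_URL, json=event, timeout=5)
resp.raise_for_status()
print("event accepted:", resp.text)
```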

Having said all this, companies typically don't directly deploy open source software for business-critical, safety-critical, or mission-critical applications. Instead, they go to someone who will offer enterprise-level services and support. Consider the relationship between the Linux operating system (OS) and Red Hat, for example. Linux is open source, so companies could theoretically simply load it onto all of their workstations and servers themselves for free. Instead, they prefer to go to a company like Red Hat, which offers enterprise-level service and support. (Founded in 1993, Red Hat was acquired by IBM in 2019.)

The way I think of this is that the Arduino is open source, and the folks at Arduino provide all of the hardware and software files I need to build my own boards, but even so I still find it quicker, easier, and more reliable to buy fully functional Arduinos from trusted vendors.

Based on this, we might think of IOTech as being the Red Hat of the EdgeX Foundry world. In addition to providing commercial support for trusted deployment of the open source baseline EdgeX Foundry platform, IOTech also offers numerous value-add features that are not available in the baseline version, such as the following:

Let's start with IOTech's edgeXpert. On the South Side, edgeXpert provides a wealth of connectors and profiles that can ingest raw data from sensors in any way, shape, or form, anything from the twisted-wire electrical specifications and protocols of the 1960s to the packet-based interfaces of today's most sophisticated sensor devices (the connectors handle the protocols, while the profiles specify what to expect coming in and what to send out). On the North Side, there are the enterprise-level connectors and profiles required to export the processed data into the fog and the cloud.

The IOTech edge platform solution (Image source: IOTech)

In between, the IOTech edge platform solution includes core data services, such as the ability to normalize edge data, aggregate edge data from multiple sensors, and store edge data for subsequent use. There are also services to analyze and process the data, along with services for security and management.
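As a generic illustration (not IOTech's actual implementation), the sketch below shows what normalising and aggregating edge data can mean in practice: vendor-specific field names and units from two hypothetical sensors are mapped onto a common record shape and then averaged per sensor.

```python
# Generic illustration (not IOTech's implementation) of normalising raw
# readings from heterogeneous sensors into one record shape, then
# aggregating them per sensor. All field names are hypothetical.
from statistics import mean

def normalize(raw):
    """Map vendor-specific field names and units onto a common record."""
    return {
        "sensor_id": raw.get("id") or raw.get("sensor"),
        "celsius": (raw["temp_f"] - 32) / 1.8 if "temp_f" in raw else raw["temp_c"],
        "timestamp": raw.get("ts"),
    }

def aggregate(records):
    """Average readings per sensor, a simple stand-in for edge aggregation."""
    by_sensor = {}
    for rec in records:
        by_sensor.setdefault(rec["sensor_id"], []).append(rec["celsius"])
    return {sid: round(mean(vals), 2) for sid, vals in by_sensor.items()}

raw_readings = [
    {"id": "s1", "temp_c": 21.5, "ts": 1588800000},
    {"sensor": "s2", "temp_f": 70.2, "ts": 1588800001},  # a different vendor's schema
    {"id": "s1", "temp_c": 22.1, "ts": 1588800002},
]

print(aggregate([normalize(r) for r in raw_readings]))  # {'s1': 21.8, 's2': 21.22}
```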

One key point is that edgeXpert boasts a distributable architecture: its various microservices can run on a single host or be distributed across a cluster based on resource availability. In some cases, portions of edgeXpert might run on the edge devices themselves, such as smart surveillance cameras, for example. The containerized deployment of microservices supports portability, while distribution of microservices provides scalability and failover support, where failover is a method of protecting computer systems from failure in which standby equipment automatically takes over when the main system fails.

edgeXpert boasts a distributable architecture (Image source: IOTech)

In the same way that you don't want the airbag in your car deploying based on a decision that's made in the cloud, there are some situations in factories when a decision needs to be made and acted on quickly. Thus, edgeXpert can also be used to provide rules-based control functions along the lines of "If this temperature exceeds this value, then turn the machine off" (more sophisticated control systems from other vendors can be integrated into the framework).
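A rule of that kind is simple enough to sketch. The fragment below is a toy illustration of the "if this temperature exceeds this value, then turn the machine off" pattern; the threshold and the shut-down action are hypothetical stand-ins, not part of edgeXpert's API.

```python
# Toy illustration of a rules-based edge control function: if a temperature
# reading exceeds a limit, shut the machine down. The limit value and the
# shut_down() action are hypothetical stand-ins.
TEMP_LIMIT_C = 95.0

def shut_down(machine_id: str) -> None:
    # Stand-in for a real actuator or control-system call.
    print(f"sending shut-down command to {machine_id}")

def on_reading(machine_id: str, temperature_c: float) -> None:
    if temperature_c > TEMP_LIMIT_C:
        shut_down(machine_id)

on_reading("press-7", 97.3)   # exceeds the limit, triggers the rule
on_reading("press-7", 80.0)   # within limits, no action
```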

The good news is that edgeXpert is ideal for situations that can be addressed with near real-time responsiveness, which equates to around 80% of use cases. The better news is that, for those time-critical systems that require ultra-low latency response times and support for real-time deterministic data processing, the edgeXrt extension to edgeXpert can be brought into play.

I get to talk to a lot of companies that are working on super cool technologies like artificial intelligence (AI) and machine learning (ML). Very often, the folks from these companies drop terms like the AIoT into the conversation. As I noted in my column What the FAQ are the IoT, IIoT, IoHT, and AIoT?:

According to the IoT Agenda, "The Artificial Intelligence of Things (AIoT) is the combination of artificial intelligence (AI) technologies with the Internet of Things (IoT) infrastructure to achieve more efficient IoT operations, improve human-machine interactions, and enhance data management and analytics [...] the AIoT is transformational and mutually beneficial for both types of technology as AI adds value to IoT through machine learning capabilities and IoT adds value to AI through connectivity, signaling, and data exchange."

I couldn't have said it better myself. I remember many occasions where I've been lured, beguiled, and tempted by promises of delight with regard to all the wonderful tasks these AI systems could perform in an industrial setting. For example, I've heard tales of how an AI system running on the edge (whatever that is) or in the fog or in the cloud can monitor the vibration of a machine using a 3-axis accelerometer, or listen to it using a MEMS microphone, and detect subtle anomalies and patterns, using these to predict timelines for future problems and to automatically schedule preemptive maintenance.
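The article doesn't say how such systems work internally, but one common, simple approach to vibration monitoring is a rolling statistic over accelerometer magnitudes. The sketch below flags samples whose magnitude deviates sharply from recent history; the window size and threshold are arbitrary assumptions, and a production system would use far more sophisticated models.

```python
# Simple illustration of vibration anomaly detection: a rolling z-score over
# the magnitude of 3-axis accelerometer samples. Window size and threshold
# are arbitrary assumptions for the sketch.
from collections import deque
from math import sqrt
from statistics import mean, stdev

WINDOW = 50      # number of recent samples to compare against (assumption)
THRESHOLD = 4.0  # z-score above which a sample is flagged (assumption)

history = deque(maxlen=WINDOW)

def check_sample(x: float, y: float, z: float) -> bool:
    """Return True if this accelerometer sample looks anomalous."""
    magnitude = sqrt(x * x + y * y + z * z)
    anomalous = False
    if len(history) == WINDOW:
        mu, sigma = mean(history), stdev(history)
        if sigma > 0 and abs(magnitude - mu) / sigma > THRESHOLD:
            anomalous = True
    history.append(magnitude)
    return anomalous

# Feed a stream of steady readings with slight jitter, then a spike.
for i in range(60):
    check_sample(0.01 * (i % 5), 0.0, 9.8)
print(check_sample(5.0, 5.0, 20.0))  # expected: True (flagged as anomalous)
```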

The one thing I don't recall any of these companies talking about is how they take the noisy (from electrical interference) and dirty (with missing or false values) raw data from myriad diverse sensors and wrangle it into a form that's suitable for their systems to peruse and ponder. Of course, I've now discovered that IOTech provides this oh-so-important piece of the jigsaw, thereby bridging the OT-IT divide.


Continue reading here:
IOTech: Bridging the OT-IT Divide - EE Journal