UPDATED Move over Edge: Safari looks to be jumping on the Chromium bandwagon, too – Chrome Unboxed

UPDATE: According to some traffic over on Twitter, it looks like this whole thing is vaporware. It has been uncovered that the bug in question in the article below has nothing to do with Safari at all. The bug is still private and cannot be viewed by non-Google employees, but it appears to be a 2015 issue titled "Move sync tests to staging." We're sorry for the incorrect reporting and have left the original story below. Additionally, here is a tweet from someone more informed than me on the subject:

This is completely fake. No such plan. The supposed email address isn't anyone on the Safari/WebKit teams, there is no ITP code in Chromium that could be enabled, and the screenshot is not a real Safari design.

A day after Christmas, a reader delivered a tasty little present to our collective inbox and the implications are pretty big. If the screenshots in this email/article are to be believed, it looks like Apple may be transitioning the Safari web browser over to Chromium in a move similar to what Microsoft has done recently with Edge. It is shocking, honestly, to consider Apple bending this way, but it makes a lot of sense in the long run.

Before we can make any sense of this, we have to get the whole picture. Apple has famously been very closed in its software efforts over the years. From iOS to macOS, they have never really felt the need to associate themselves with open source material in general. Apple builds the hardware, maintains the OS and software, and if you want to sell your apps on their platforms, you just have to suck it up and play ball.

This all works fine when you sell iPhones like hotcakes. When you so heavily dominate a given market (the US) with your hardware, you get to make whatever rules you want and developers and consumers alike get whatever it is you feed them. But this doesn't work with the web. The web is the ultimate open platform and delivery tool. Thank God we've not all caved on that expectation. Somehow in the midst of this open-minded nature, no one has ever been OK with being forced to use one browser over another. People don't care what browser their computer of choice ships with: they want to use the browser they enjoy using that just works.

For the vast majority of users, Chrome became that browser right in the middle of companies like Apple and Microsoft choosing to lag behind on rapidly-changing web standards. Whether it was due to stubbornness or incompetence, Safari, Internet Explorer, and Edge were constantly lacking in multiple facets when compared with Chrome. As those browsers stayed consistently behind the curve, Chrome only grew in popularity to where it is in its dominant place now. At any point, Apple or Microsoft could have simply decided that third-party browsers were not allowed, but there's just something about the web that wouldn't allow that to happen. There's no way users would have it.

One of the primary benefits of Chrome is the fact that it is based on the Chromium browser. This is the open source foundation of Chrome as you know it. Chromium and Chromium OS are both developed out in the open and are contributed to by anyone interested in their success. Chrome (Google's version of the Chromium browser) has a handful of proprietary bits and pieces that are Google's, and those things distinguish it from other browsers that run on Chromium, but because it is built on an open source base, Chrome gets to take full advantage of not just a single team of developers, but an entire horde.

Other browsers have seen the vast opportunity and done the same. Browsers like Brave, Opera, Vivaldi, Edge, Samsung's Internet browser, and a slew of others all run on this same Chromium foundation. What this means is that the teams charged with developing each of these expressions of Chromium all care about what Chromium can and can't do, and all are working towards a more functional, better-performing web browser core. Sure, they all serve their own interests along the way, but the work they do for their platform is there for others to leverage as well.

This is the power of open source software. Instead of a ton of companies competing to build differing standards and ways of doing things, they all spend that same time working on a common project that will be so much better as a result. Microsoft's work on Chromium for Edge has already begun yielding better RAM management and battery life. These are things important to Microsoft in their browser, so they will become things that are also important for Chromium. In this system, consumers win every time.


Now, we get to this new report. As you may assume, Safari is both behind the curve and closed source, so a move to a Chromium-based browser would be both beneficial to Apple and users alike. Apple gets to deliver Safari without all the technical deficiencies it comes with and users get to just use Safari out of the box without having to decide which browser to go with when they buy their new Mac. If Safari is already installed and just as good from a technical standpoint as Chrome, why bother installing something else?

If this report from iphones.ru is to be believed, there's a good chance Safari may end up built on Chromium sooner rather than later. In the article, a bug report (that has since been hidden) was found that requests Intelligent Tracking Prevention be activated in Chrome 80. This is a big deal since ITP is proprietary to Safari at this point and the request is coming from an Apple employee. Why would an Apple developer be requesting a proprietary Safari feature be activated in Chrome 80? Well, you know the answer to that, don't you?

In addition to this evidence, there are a couple screen grabs from the bug report. The first one clearly shows this request coming from a developer with an @apple.com address along with an included screenshot, and upon quick inspection, the screenshot is of Safari running on Chromium in an alpha state.

A couple more notes, here. First, the bug has since been marked as private. If this were fake, searching the bug number would come up with an error. Instead, it comes up as blocked and needing permission to view. There's no real reason for anyone to hide this if it wasn't legit. They could have simply replied in the bug thread and told this guy to stop messing around. Instead, they hid the references, and this tells me that something was uncovered that shouldn't have been.

Second, the alpha version of the Chromium-based Safari being pictured is just that: alpha. It looks pretty rough, and that is honestly to be expected. Ripping out the guts of your browser forces you to focus only on function at first. Apple is likely doing a ton of work under the hood to make sure all the functional parts are lining up and working before they even consider making the outside look nice, so don't read too much into the overall hideous appearance in that screenshot.

Finally, it is worth noting that in the first screenshot, you can see the list of operating systems this is being added to: Windows, Linux, and Mac. This could mean that we'll see a Chromium-based Safari on all three major desktop operating systems, and it could signal yet another softening of the walled-garden approach Apple has long been known for. Their relatively recent inclusion of PWAs on iPhone is further evidence that Apple may finally be seeing the light with broader, open source, web-based software, and that's good news for everyone.

See the rest here:
UPDATED Move over Edge: Safari looks to be jumping on the Chromium bandwagon, too - Chrome Unboxed

Suse Marks Nine Years Of Continuous Growth With Successful FY2019 – E3zine.com

Suse heralded its ninth consecutive year of revenue growth, announcing financial results and highlights from its fiscal year 2019 ended Oct. 31.

Having become the world's largest independent open source company earlier this year, Suse saw its application delivery subscription revenue jump 299 percent year over year. In addition, cloud revenue increased 64 percent, driven by cloud providers like Amazon Web Services, Google Cloud and Microsoft Azure, as Suse's Cloud Service Provider ecosystem grew exponentially.

Customer deals valued at $1 million or more increased 13 percent, contributing to a double-digit jump in revenue. As growth accelerated, Suses employee base grew 11 percent.

"Suse's success comes from our commitment to customers, open source software and innovation that is relevant to the market and customers today," said Melissa Di Donato, CEO of Suse. "As the world's largest independent open source company, we are in a position to serve customer needs, accelerate our growth and move at a speed like never before. Suse will continue to try to be the most customer-centric open source company, to meet customers where they are and help them get to what success looks like to them. That commitment shows in the numbers from this year and years past, and even more impressively: it shows in the success of our customers and partners."

Suse's commitment to customers and technology that matters can be seen in the company's increased focus and investment in strategic software solutions that enable customers to create, deploy and manage workloads anywhere (on premises, hybrid and multi-cloud) as they embrace digital transformation to compete more effectively in today's markets.

In addition to increased focus on the application delivery market, Suse is maintaining and growing its commitment to delivering the best enterprise Linux in the industry along with best-in-class software-defined storage based on the Ceph open source project. Suse always works to provide innovative technology and services that best serve the needs of customers and partners.

Suse continues to focus on growth and expansion, and the companys commercial growth is and will continue to be driven both organically and through potential acquisitions.

View post:
Suse Marks Nine Years Of Continuous Growth With Successful FY2019 - E3zine.com

Mulesoft development services – helping businesses with provision of customized solutions – The African Exponent

Business organizations have to continuously adapt to changing times. To stay ahead of the competition in the market and to efficiently tackle the problems that come with growth, an organization must adopt new digital ways of working. The workflow within an organization depends upon communication between various departments and the completion of processes. This requires complete interconnectivity between the applications, departments, processes and people connected to a business enterprise, such as customers, suppliers and retailers. Integration of business processes and applications requires the adoption of a cloud-based system for storing and sharing information. Every process stays connected with other operations, which enables faster completion of assigned tasks.

Mulesoft is one such company, providing technical consultancy and software development services to business enterprises. Mulesoft software development services help an organization by developing, creating and integrating new digital, cloud-based ways of working into the normal workflow of the business. They help in the development of automated integration systems that can streamline workflow within an organization. The company helps create customized digital software, such as cloud-based technology and application program interfaces (APIs), which can provide security and strength to an organizational structure. Moreover, its consultation services help an organization fully understand these functions and their uses for building a better organization.

Mulesoft consultation and software development services help in the development and deployment of digital solutions that remove the obstacles an organization may face while improving and developing itself.

The company provides the following services:

Modernization of business enterprise:

Mulesoft helps provide solutions and systems that can improve the overall productivity and effectiveness of a business enterprise. It combines various open source software solutions with its own expertise to give the IT department of an organization the know-how and skill needed to understand the new system. Mulesoft consulting can help modernize IT infrastructure, migrate to a flexible cloud-based system, implement a storage solution and accelerate business performance.

Building better applications:

Mulesoft development services help an organization build better infrastructure and applications, which supports the adoption of new digital and cloud-based systems. Its consultation and development services give an organization what it needs to build better applications at a faster pace, which in turn helps the marketing of those applications. They help integrate various processes, applications and data for better transmission of information and improved performance.

Optimization of operations:

Mulesoft improves the way various operations are carried out in the business. Operations are automated, infrastructure is improved, IT is streamlined and the efficiency with which work is performed goes up. This enables faster delivery of applications.

Thus, Mulesoft development helps an organization at each and every step of the transition from a traditional to a more modern system of workflow. Every organization wishing to improve its level of operations should consider adopting Mulesoft development and consultation services, which are regarded as among the best in the world owing to their seamless workflow and support.

Read more:
Mulesoft development services - helping businesses with provision of customized solutions - The African Exponent

gInk is an on-screen annotation software for Windows – Ghacks Technology News

On-screen annotation software is useful in a number of situations, including during presentations or demonstrations. The main idea behind the open source application gInk is to provide Windows users with an easy to use yet powerful program to make on-screen annotations with ease.

Windows users may download the latest version of the program from the project's GitHub website. Those interested in the source code find it hosted there as well.

All it takes is to download the latest version of the software, extract the archive it comes in, and run the executable from the destination directory.

The on-screen annotation software sits idly in the background on start. You may launch it either with a left-click on the system tray icon or use the global hotkey Ctrl-Alt-G instead. The toolbar is displayed at the bottom and most on-screen activity is blocked at the same time.

Use hotkeys, the mouse or touch-input to select one of the available tools to start using it. Several pencils are provided to draw on the screen; there is also an eraser, an undo function, and a trashbin to destroy everything that has been annotated up to that point. The arrow icon does not paint arrows on the screen but is used to activate mouse functionality (to activate links or buttons). A click on the camera icon creates a snapshot of the screen.

The application supports mouse, pen, and touch input. Pen users may notice that it can distinguish between different pen pressures. Another useful feature is that gInk supports multi-display setups as well.

The options of the open source software provide additional settings. You may select which drawing tools are displayed when you invoke the toolbar: all but the pen-width panel are shown by default, and everything except the pencil selection may be removed from the toolbar.

Other options provided include the ability to drag the toolbar around on the screen, to define up to ten pens, each with its own color, alpha and width, and to set up or edit hotkeys (for each of the pens and tools).

Tip: check out ScreenMarker, which provides similar functionality.

gInk is a well-designed screen annotation software for Windows. It is portable and open source, and supports most tools and features that one would expect from a program of its kind. I'd like to see options to place some ready-made elements on the screen as well as text. While you can create those using the pens, it would make things easier if these were provided by default.

Now You: have you used screen annotation programs in the past?

Read the rest here:
gInk is an on-screen annotation software for Windows - Ghacks Technology News

Code Analysis and Happy Holidays – Enterprise License Optimization Blog

December 19, 2019 Kendra Morton

It's been a great year at Flexera, and I'm hoping my readers, too, prospered and experienced their own versions of success in 2019. I've enjoyed the time I've spent on my blog, delivering my views to all valued members of the open source community. Software Composition Analysis (SCA) is thriving; yes, at Flexera, but also as a technology that is impacting companies across the globe and how they manage open source software, provide transparency across teams, and enable more innovation because license, IP and security risk protocols are in place. There's peace of mind.

And that's what everyone wants as 2019 rolls into a new year.

2020 is bound to be another year that brings unprecedented stories and trends related to code analysis. Trends like:

I'm looking forward to 2020 while stopping to reflect on the past year. It's my greatest pleasure to wish you happy holidays and a successful 2020.

I'd like to hear from you.

What's your trend meter say about open source technologies in 2020?

What are you the most excited about?

Tags: Open Source Compliance, Open Source Security, Open Source Software (OSS)

Read more:
Code Analysis and Happy Holidays - Enterprise License Optimization Blog

Big Data Predictions: What 2020 Will Bring – Datanami

With just over a week left on the 2019 calendar, it's now time for predictions. We'll run several stories featuring the 2020 predictions of industry experts and observers in the field. It all starts today with what is arguably the most critical aspect of the big data question: the data itself.

There's no denying that Hadoop had a rough year in 2019. But is it completely dead? Haoyuan "HY" Li, the founder and CTO of Alluxio, says that Hadoop storage, in the form of the Hadoop Distributed File System (HDFS), is dead, but Hadoop compute, in the form of Apache Spark, lives strong.

"There is a lot of talk about Hadoop being dead," Li says. "But the Hadoop ecosystem has rising stars. Compute frameworks like Spark and Presto extract more value from data and have been adopted into the broader compute ecosystem. Hadoop storage (HDFS) is dead because of its complexity and cost and because compute fundamentally cannot scale elastically if it stays tied to HDFS. For real-time insights, users need immediate and elastic compute capacity that's available in the cloud. Data in HDFS will move to the most optimal and cost-efficient system, be it cloud storage or on-prem object storage. HDFS will die but Hadoop compute will live on and live strong."
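
To make Li's point concrete, here is a minimal PySpark sketch of Hadoop-style compute running directly against cloud object storage instead of HDFS. The bucket and paths are hypothetical, and the cluster is assumed to have the S3A connector and credentials configured:

from pyspark.sql import SparkSession

# Spark compute without HDFS: the same DataFrame code that once pointed at
# hdfs:// paths reads straight from object storage (hypothetical bucket).
spark = SparkSession.builder.appName("compute-without-hdfs").getOrCreate()

events = spark.read.parquet("s3a://example-data-lake/events/2019/")
events.groupBy("event_type").count().show()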

As HDFS data lake deployments slow, Cloudian is ready to swoop in and capture the data into its object store, says Jon Toor, CMO of Cloudian.

"In 2020, we will see a growing number of organizations capitalizing on object storage to create structured/tagged data from unstructured data, allowing metadata to be used to make sense of the tsunami of data generated by AI and ML workloads," Toor writes.
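
As an illustration of the kind of tagging Toor describes, here is a minimal sketch against an S3-compatible object store using boto3. The endpoint, bucket, keys, and tag values are all hypothetical; Cloudian, like most object stores, exposes the S3 API:

import boto3

# Hypothetical endpoint and bucket; any S3-compatible store works the same way.
s3 = boto3.client("s3", endpoint_url="https://objectstore.example.com")

# Store the object together with user-defined metadata...
s3.put_object(
    Bucket="ml-training-data",
    Key="frames/cam01/000123.jpg",
    Body=b"<image bytes>",
    Metadata={"source": "cam01", "label": "pedestrian"},
)

# ...and tag it so it can be filtered on later without reading it back.
s3.put_object_tagging(
    Bucket="ml-training-data",
    Key="frames/cam01/000123.jpg",
    Tagging={"TagSet": [{"Key": "dataset", "Value": "validation"}]},
)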

The end of one thing, like Hadoop, will give rise to the beginning of another, according to ThoughtSpot CEO Sudheesh Nair.

"Over the last 10 years or so, we've seen the rise, plateau, and the beginning of the end for Hadoop," Nair says. "This isn't because Big Data is dead. It's exactly the opposite. Every organization in the world is becoming a Big Data company. It's a requirement to operate in today's business landscape. Data has become so voluminous, and the need for agility with this data so great, however, that organizations are either building their own data lakes or warehouses, or going directly to the cloud. As that trend accelerates in 2020, we'll see Hadoop continue to decline."

When data gets big enough, it exerts a gravitational-like force, which makes it difficult to move, while also serving to attract even more data. Understanding data gravity will help organizations overcome barriers to digital transformation, says Chris Sharp, CTO of Digital Realty.

"Data is being generated at a rate that many enterprises can't keep up with," Sharp says. "Adding to this complexity, enterprises are dealing with data, both useful and not useful, from multiple locations that is hard to move and utilize effectively. This presents enterprises with a data gravity problem that will prevent digital transformation initiatives from moving forward. In 2020, we'll see enterprises tackle data gravity by bringing their applications closer to data sources rather than transporting resources to a central location. By localizing data traffic, analytics and management, enterprises will more effectively control their data and scale digital business."

All things being equal, it's better to have more data than less of it. But companies can move the needle just by using available technology to make better use of the data they already have, argues Beaumont Vance, the director of AI, data science, and emerging technology at TD Ameritrade.

"As companies are creating new data pools and are discovering better techniques to understand findings, we will see the true value of AI delivered like never before," Vance says. "At this point, companies are using less than 20% of all internal data, but through new AI capabilities, the remaining 80% of untapped data will be usable and easier to understand. Previous questions which were unanswerable will have obvious findings to help drive massive change across industries and societies."

Big data is tough to manage. What if you could do AI with small data? You can, according to Arka Dhar, the CEO of Zinier.

"Going forward, we'll no longer require massive big data sets to train AI algorithms," Dhar says. "In the past, data scientists have always needed large amounts of data to perform accurate inferences with AI models. Advances in AI are allowing us to achieve similar results with far less data."

How you store your data dictates what you can do with it. You can do more with data stored in memory than on disk, and in 2020, we'll see organizations storing more data on memory-based systems, says Abe Kleinfeld, the CEO of GridGain.

"In 2020, the adoption of in-memory technologies will continue to soar as digital transformation drives companies toward real-time data analysis and decision-making at massive scale," Kleinfeld says. "Let's say you're collecting real-time data from sensors on a fleet of airplanes to monitor performance and you want to develop a predictive maintenance capability for individual engines. Now you must compare anomalous readings in the real-time data stream with the historical data for a particular engine stored in the data lake. Currently, the only cost-effective way to do this is with an in-memory data integration hub, based on an in-memory computing platform like Apache Ignite that integrates Apache Spark, Apache Kafka, and data lake stores like Hadoop. 2020 promises to be a pivotal year in the adoption of in-memory computing as data integration hubs continue to expand in enterprises."
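
For a sense of what a lookup against such an in-memory platform looks like, here is a minimal sketch using Apache Ignite's thin Python client. It assumes a local Ignite node on the default thin-client port; the cache name and values are hypothetical:

from pyignite import Client

# Connect to a local Ignite node (default thin-client port 10800).
client = Client()
client.connect("127.0.0.1", 10800)

# Hypothetical cache of sensor readings kept in memory.
readings = client.get_or_create_cache("engine_vibration")
readings.put("engine-42", 0.91)

# Real-time lookup against in-memory state, with no disk round trip.
print(readings.get("engine-42"))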

Big data can make your wildest business dreams come true. Or it can turn into a total nightmare. The choice is yours, say Eric Raab and Kabir Choudry, vice presidents at Information Builders.

"Those that have invested in the solutions to manage, analyze, and properly action their data will have a clearer view of their business and the path to success than has ever been available to them," Raab and Choudry write. "Those that have not will be left with a mountain of information that they cannot truly understand or responsibly act upon, leaving them to make ill-informed decisions or deal with data paralysis."

Let's face it: Managing big data is hard. That doesn't change in 2020, which will bring a renewed focus on data orchestration, data discovery, data preparation, and model management, says Todd Wright, head of data management and data privacy solutions at SAS.

"According to the World Economic Forum, it is predicted that by 2020 the amount of data we produce will reach a staggering 44 zettabytes," Wright says. "The promise of big data never came from simply having more data and from more sources but from being able to develop analytical models to gain better insights on this data. With all the work being done to advance analytics, AI and ML, it is all for naught if organizations do not have a data management program in place that can access, integrate, cleanse and govern all this data."

Organizations are filling up NVMe drives as fast as they can to help accelerate the storage and analysis of data, particularly involving IoT. But doing this alone is not enough to ensure success, says Nader Salessi, the CEO and founder of NGD Systems.

"NVMe has provided a measure of relief and proven to remove existing storage protocol bottlenecks for platforms churning out terabytes and petabytes of data on a regular basis," Salessi writes. "Even though NVMe is substantially faster, it is not fast enough by itself when petabytes of data are required to be analyzed and processed in real time. This is where computational storage comes in and solves the problem of data management and movement."

Data integration has never been easy. With the ongoing data explosion and expansion of AI and ML use cases, it gets even harder. One architectural concept showing promise is the data fabric, according to the folks at Denodo.

"Through real-time access to fresh data from structured, semi-structured and unstructured data sets, data fabric will enable organizations to focus more on ML and AI in the coming year," the company says. "With the advancement in smart technologies and IoT devices, a dynamic data fabric provides quick, secure and reliable access to vast data through a logical data warehouse architecture, thus facilitating AI-driven technologies and revolutionizing businesses."

Seeing how disparate data sets are connected using semantic AI and enterprise knowledge graphs (EKG) provides another approach for tackling the data silo problem, says Saurav Chakravorty, the principal data scientist at Brillio.

"An organization's valuable information and knowledge is often spread across multiple documents and data silos, creating big headaches for a business," Chakravorty says. "EKG will allow organizations to do away with semantic incoherency in a fragmented knowledge landscape. Semantic AI and EKG complement each other and can bring great value overall to enterprise investments in data lakes and big data."

2020 holds the potential to be a breakout year for storage-class memory, argues Charles Fan, the CEO and co-founder of MemVerge.

"With an increasing demand from data center applications, paired with the increased speed of processing, there will be a huge push towards a memory-centric data center," Fan says. "Computing innovations are happening at a rapid pace, with more and more computation tech, from x86 to GPUs to ARM. This will continue to open up new topology between CPU and memory units. While architecture currently tends to be more disaggregated between the computing layer and the storage layer, I believe we are headed towards a memory-centric data center very soon."

We are rapidly moving toward a converged storage and processing architecture for edge deployments, says Bob Moul, CEO of machine data intelligence platform Circonus.

"Gartner predicts there will be approximately 20 billion IoT-connected devices by 2020," Moul says. "As IoT networks swell and become more advanced, the resources and tools that manage them must do the same. Companies will need to adopt scalable storage solutions to accommodate the explosion of data that promises to outpace current technology's ability to contain, process and provide valuable insights."

Dark data will finally see the light of day in 2020, according to Rob Perry, the vice president of product marketing at ASG Technologies.

"Every organization has islands of data, collected but no longer (or perhaps never) used for business purposes," Perry says. "While the cost of storing data has decreased dramatically, the risk premium of storing it has increased dramatically. This dark data could contain personal information that must be disclosed and protected. It could include information subject to Data Subject Access Requests and possible required deletion, but if you don't know it's there, you can't meet the requirements of the law. This data could also hold the insight that opens up new opportunities that drive business growth. Keeping it in the dark increases risk and possibly masks opportunity. Organizations will put a new focus on shining the light on their dark data."

Open source databases will have a good year in 2020, predicts Karthik Ranganathan, founder and CTO at Yugabyte.

"Open source databases that claimed zero percent of the market ten years ago now make up more than 7%," Ranganathan says. "It's clear that the market is shifting, and in 2020 there will be an increase in commitment to true open source. This goes against the recent trend of database and data infrastructure companies abandoning open source licenses for some or all of their core projects. However, as technology rapidly advances it will be in the best interest of database providers to switch to a 100% open source model, since freemium models take a significantly longer period of time for the software to mature to the same level as a true open source offering."

However, 2019 saw a pullback from pure open source business models at companies like Confluent, Redis, and MongoDB. Instead of open source software, the market will be responsive to open services, says Dhruba Borthakur, the co-founder and CTO of Rockset.

"Since the public cloud has completely changed the way software is delivered and monetized, I predict that the time for open sourcing new, disruptive data technologies will be over as of 2020," Borthakur says. "Existing open-source software will continue to run its course, but there is no incentive for builders or users to choose open source over open services for new data offerings. ... Ironically, it was ease of adoption that drove the open-source wave, and it is ease of adoption of open services that will precipitate the demise of open source, particularly in areas like data management. Just as the last decade was the era of open-source infrastructure, the next decade belongs to open services in the cloud."

Related Items:

2019: A Big Data Year in Review Part One

2019: A Big Data Year in Review Part Two

Read more:
Big Data Predictions: What 2020 Will Bring - Datanami

Hugging Face Raises $15 million to Expand its Open Source Software on Conversational AI – IBL News

IBL News | New York

New York-based Hugging Face, a startup known by an app launched in 2017 that allows you to chat with an artificial digital friend, recently open-sourced its natural language processing (NLP) library, called Transformers. It had massive success, as there are over a million downloads and 1,000 companies using it, including Microsoft's Bing.

Transformers can be leveraged for text classification, information extraction, summarization, text generation, and conversational artificial intelligence.
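
For a flavor of the library, here is a minimal sketch of its pipeline interface applied to the text classification use case mentioned above. The first call downloads a pretrained model, and the printed score is illustrative:

from transformers import pipeline

# Loads a default pretrained sentiment model on first run; scores will vary.
classifier = pipeline("sentiment-analysis")
print(classifier("Open-sourcing Transformers was a great idea."))
# e.g. [{'label': 'POSITIVE', 'score': 0.999...}]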

On Tuesday, Hugging Face, with just 15 employees, announced the close of a $15 million series, a funding round that adds to a previous amount of $5 million.

The round, intended to triple Hugging Face's headcount in New York and Paris and fund the release of new software libraries, was led by Lux Capital, with participation from Salesforce chief scientist Richard Socher and OpenAI CTO Greg Brockman, as well as Betaworks and A.Capital.

"Tech giants are not taking a truly open-source approach on NLP, and their research and engineering teams are totally disconnected," Hugging Face CEO Clément Delangue said on VentureBeat.

"On one hand, they provide black-box NLP APIs like Amazon Comprehend or Google APIs that are neither state-of-the-art nor flexible enough. On the other hand, they release science open source repositories that are extremely hard to use and not maintained (BERT's last release is from May and only counts 27 contributors)."

See the original post here:
Hugging Face Raises $15 million to Expand its Open Source Software on Conversational AI - IBL News

How RISC-V is creating a globally neutral, open source processor architecture – VentureBeat

Arm dominates the microprocessor architecture business, as its licensees have shipped 150 billion chips to date and are shipping 50 billion more in the next two years. But RISC-V is challenging that business with an open source ecosystem of its own, based on a new kind of processor architecture that was created by academics and is royalty free.

This month, 2,000 engineers attended the second annual RISC-V Summit in San Jose, California. The leaders of the effort, including nonprofit RISC-V Foundation CEO Calista Redmond, said they see billions of cores shipping in the future.

RISC-V started in 2010 at the University of California at Berkeley Par Lab Project, which needed an instruction set architecture that was simple, efficient, and extensible and had no constraints on sharing with others. Krste Asanovic (a founder of SiFive), Andrew Waterman, Yunsup Lee, and David Patterson created RISC-V and built their first chip in 2011. In 2014, they announced the project and gave it to the community.

RISC-V enables members to design processors and other chips that are compatible with software designed for the architecture, and it means licensees won't have to pay a royalty to Arm. RISC-V is politically neutral, as it's moving its base to Switzerland. That caught the attention of executives, including Infineon CEO Reinhard Ploss, according to RISC-V board member Patterson. With RISC-V, Chinese companies wouldn't have to depend on Western technology, which became an issue when the U.S. imposed tariffs and Arm had to determine whether it could license U.S. technology to Huawei.

Perhaps because of this, RISC-V activity is picking up around the globe. Redmond said in an interview with VentureBeat that RISC-V is creating a technological revolution. It's not clear how many RISC-V startups there are, but the group counts more than 100 member companies with fewer than 500 employees.

Here's an edited transcript of our interview.

Above: Calista Redmond is CEO of the RISC-V Foundation.

Image Credit: Dean Takahashi

VentureBeat: There was a little bit of comment about the burst of people in China that are interested [in RISC-V], partly because of what Huawei is doing.

Calista Redmond: I haven't seen a burst of people in China. I'm not sure if someone else saw something I didn't, but the membership has grown steadily at a global level. If you look at our line graph, it's continuous. We didn't see a spike at any particular point.

VentureBeat: The representation in China, what would you say about that? Where is it relative to the whole world?

Redmond: In terms of global members, we have [fewer] than 50. I don't remember the exact number off the top of my head. It's somewhere between 30 and 50. China has two groups of interested organizations that have 200 members. Some of those are also members of the global foundation, and some are just members of two RISC-V groups that have self-assembled in China. There's CRVIC and CRVA. One is focused more on academic interests and one is focused more on industry interests. We collaborate with both of those groups on activities of global interest in China.

VentureBeat: There are all these different things that are appealing. Is it zero license fees, or ...?

Redmond: There's no license fee. When you get to open source and you're looking at the ISA spec, that's open and freely available. You do need to be a member of the foundation to leverage the trademark, but everything is publicly available. There's no license fee. A license fee would indicate a commercial relationship, and we're a nonprofit foundation. We don't have commercial relationships. The way that we generate revenue is through membership fees, which are not attributed in a royalty or license structure, like you would with traditional proprietary ISAs.

VentureBeat: Is this territory-free designation appealing to those who might face some kind of political border?

Redmond: When you open-source something, it's globally open and available. That IP is not governed by a geographic jurisdiction. That's how all open source works. That's how global open standards have worked for 100 years.

Above: The expo floor at the RISC-V Summit. It's small for now.

Image Credit: Dean Takahashi

VentureBeat: This is a key difference between you and Arm, though. Someone might not choose Arm because of this distinction.

Redmond: Arm also has some open IP. So does Intel. So does Power. From a base building block, RISC-V does not have any other tangential requirements to it that you might find in other models. At its base, there are two areas that you make decisions in. One, technically, am I going to be able to accomplish what I need to for the workload I need to serve? Two, does the business model fit for my incredible investment and long-term strategic durability? It's both of those pieces. More and more, with technology decisions you have lots of choice. It comes down to a business reason.

VentureBeat: David Patterson said the CEO of Infineon was interested, because you were moving the headquarters to Switzerland?

Redmond: We've had remarks made from different companies, different geographies, that indicate to us that they would be more open to investing in RISC-V if they felt that that gave them some level of comfort. The incorporation in Delaware as it is today: first of all, none of us lives in Delaware. Moving it from Delaware to Switzerland makes no fundamental difference. We are not circumventing any regulations. We are not an insurance policy. We are fundamentally doing it because it calms some of the concerns that we have seen in the greater community.

VentureBeat: There's a perception when you're a U.S. nonprofit, as opposed to a neutral nonprofit.

Redmond: Could regulations change? Who knows, in the future? But they haven't changed in decades of open source software development. I don't think they're going to change in hardware.

VentureBeat: The numbers of chips: are you going to start counting how much progress you're making every year?

Redmond: The counts that have been surfacing and the reports that you've seen have been on cores, not on chips. RISC-V is more focused on cores, just as an equalizer in what we can count versus chips, because a chip may have two cores or it may have 30 cores. Now, Chips Alliance or another group may be more interested in chips, but then they probably need to delineate between different kinds: multi-threaded, whatever.

Above: RISC-V board members, (left to right): Krste Asanovic, Zvonimir Bandic, Ted Speers, Calista Redmond, Frans Sijstermans, and Rob Oshana.

Image Credit: Dean Takahashi

VentureBeat: Is there a long run there that matters? You're not a measurable percentage of Arm's shipments, right?

Redmond: I don't think we're competing head to head with Arm or Intel or Power or anything in that way. I get what you're looking at. We're sort of ... the sweet spot for RISC-V is starting in this space where there isn't a current entrenched participant. Arm came in on mobile when there was no clear leader in that space. Intel survived and dominated in servers and desktops amongst many competitors.

I don't think that there is a declared winner yet in embedded or in some of the new IoT spaces, or AI. That's where I think you're going to see the most adoption of RISC-V initially. Then you'll see an additional rise after that where RISC-V may be looked at in the next generations of some of those architectures that had previously been there. But most companies don't like to rip and replace prior investments.

VentureBeat: The Samsung talk seemed interesting, where they're not going to replace old things, but they're adding this in.

Redmond: They're focused on the next generation. You heard them talk a lot about 5G. Modems, sensors, automotive. That's just it, right? You're seeing a lot of these companies start to diversify into the adjacent spaces of their core businesses. In those adjacent spaces, that's exactly the point I was getting to earlier. That's how you're starting to see the advances that RISC-V is making, in those new spaces.

VentureBeat: I guess that's how you gain market share. You're not replacing things, but as the new things you're in take off, they become a bigger part of the market.

Redmond: And where do you think the fastest growth rates are? Old spaces or new spaces? Probably those new spaces have attractive growth rates. Not always, but often.

Above: RISC-V co-creator Krste Asanovic at the RISC-V Summit.

Image Credit: Dean Takahashi

VentureBeat: Do you then have to do anything to prioritize that particular space?

Redmond: It's an interesting question. From our start, we started working on the base building blocks. Here's your core, your basic ISA. Here are these extensions that you can pick and choose off the menu, what you'd like to include. Then, as we've matured as an organization, we're starting to go up from that into software. As we get into software, we need to prioritize implementation stacks so that you have a single homogenous kind of perspective on that, from which of course you can diverge at any point in the path, but here's our recommended path to take a fully open approach. Those implementations could be in embedded. They could be in mobile. They could be some of the scale-out HPC-type of realm. But we are starting to look at that more so, as we look into the software side, as we look deeply at the extensions.

VentureBeat: As far as people taking RISC-V seriously, what would you point to as the milestones that are getting you into more conversations or into more doors?

Redmond: Nvidia is already including it as part of their core product.

VentureBeat: They came on board in 2016?

Redmond: I don't remember the exact date, but it's definitely out there. They're up to millions of cores at this point. Western Digital, a billion cores. The trajectory continues as more large organizations come on board and the startups start to make progress. The level of investment in startups is continuing to grow. The VC spend on RISC-V is starting to grow. You see some of those early successes with the likes of SiFive.

VentureBeat: Is that something you've figured out yet, how many startups are going?

Redmond: No, but we have about 100 companies that [have fewer] than 500 people in the organization today. I talk to new startups constantly. I would wager that a strong percentage (I don't know what that percentage would be) of our individual members are also looking at RISC-V as an area to do a startup business.

Above: The keynote crowd at the RISC-V Summit.

Image Credit: Dean Takahashi

VentureBeat: It's probably true, then, that a large percentage of chip design startups are going to be RISC-V? Or at least processor designs?

Redmond: It's difficult to start on a different architecture. There are higher barriers to entry if you want to go Arm or Intel or something else. We're really equalizing there.

VentureBeat: Because youre going into established markets?

Redmond: No. We bring an established and growing community and ecosystem and partners and tools and resources and reference designs and cores and extensions. We have all of those pieces that give you a running start as an entrepreneur without the burden of licensing royalties or other commitments to your business. It is a much easier business and technical decision to make.

VentureBeat: I detect a certain nervousness at Arm, even though both sides are saying that they're not directly competing with each other. It's the way Intel used to talk about the x86 startups.

Redmond: It's like we're all at the same dance. The music has changed a little bit, and now we're trying to figure out how we all move in that space. It's an interesting dynamic. There are adjustments going on across companies. You see it at Intel, at Arm, at Power. RISC-V is just the latest to show up at the party, and we've come at it with a completely different approach.

VentureBeat: This other thing about momentum: if you cause the other guys to change in some way, you're having an impact in the market. If Arm starts doing these custom instructions because you guys provide an alternative, that's a change in the market.

Redmond: Another interesting perspective is, where do you see different architectures all in the same chip? It's possible for Arm and RISC-V to coexist. How do you navigate that frontier? That was really interesting with the OmniXtend fabric from Western Digital. Do you want to share memory, share network, share storage? Here's how we can do this across multiple architectures.

What RISC-V has brought to the game is that you're no longer locked into one architecture choice, which also means you're not locked into one vendor. That vendor lock-in is something the industry has been concerned about for decades.

Above: SiFive is making licensable RISC-V CPUs.

Image Credit: SiFive

VentureBeat: They made that strong statement about how memory doesn't need to be tied to a processor.

Redmond: Right. Look at what happened in the software space. It was that lock-in that really gave a significant nudge to the open source movement in the first place.

VentureBeat: I remember the days when Microsoft was going around the world getting Windows declared the operating system of entire countries.

Redmond: Right, right. It's interesting to see how they as an organization have shifted as well. They've started to embrace the idea that business models need to evolve. You look back at the Industrial Revolution. At some point we thought coal trains were the greatest, but eventually we got to airplanes. It's the same in our space.

More here:
How RISC-V is creating a globally neutral, open source processor architecture - VentureBeat

This Week In Security: Unicode, Truecrypt, And NPM Vulnerabilities – Hackaday

Unicode, the wonderful extension to ASCII that gives us characters far beyond the Latin alphabet, has had some unexpected security ramifications. The most common problems with Unicode are visual security issues, like character confusion between letters. For example, the English M (U+004D) is indistinguishable from the Cyrillic М (U+041C). Can you tell the difference between IBM.com and IBМ.com?

This bug, discovered by [John Gracey], turns the common problem on its head. Properly referred to as a case mapping collision, it's the story of different Unicode characters getting mapped to the same upper or lowercase equivalent.

'ß'.toUpperCase() === 'SS' // true
// Note the Turkish dotless i
'John@Gıthub.com'.toUpperCase() === 'John@Github.com'.toUpperCase() // true

GitHub stores all email addresses in their lowercase form. When a user requests a password reset, GitHub's logic worked like this: take the email address that requested a password reset, convert it to lowercase, and look up the account that uses the converted email address. That by itself wouldn't be a problem, but the reset is then sent to the email address that was requested, not the one on file. In retrospect, this is an obvious flaw, but without the presence of Unicode and the possibility of a case mapping collision, it would be a perfectly safe practice.
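
A short Python sketch makes the flaw easy to see. This is hypothetical code, not GitHub's; it uses the Kelvin sign (U+212A), which case-maps to a plain ASCII k, to show how an attacker-supplied lookalike address can reach the victim's account:

# Hypothetical reset flow, not GitHub's actual code.
# The Kelvin sign 'K' (U+212A) lowercases to an ordinary ASCII 'k',
# so two distinct addresses collide after normalization:
assert "\u212A".lower() == "k"

accounts = {"kelvin@example.com": "victim"}  # addresses stored lowercased

def send_reset_link(to: str) -> None:
    print(f"reset link sent to {to!r}")

def request_password_reset(supplied_email: str) -> None:
    account = accounts.get(supplied_email.lower())  # case-mapped lookup
    if account:
        send_reset_link(to=supplied_email)  # BUG: mails the supplied address
        # FIX: mail the address stored on the account instead

# An attacker-supplied lookalike address matches the victim's account:
request_password_reset("\u212Aelvin@example.com")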

This flaw seems to have been fixed quite some time ago, but was only recently disclosed. It's also a novel problem affecting Unicode that we haven't covered. Interestingly, my research has turned up an almost identical problem at Spotify, back in 2013.

TrueCrypt is an amazing piece of software that literally changed the world, giving every computer user a free, source-available solution for hard drive encryption. While the source of the program was made freely available, the license was odd and restrictive enough that it's technically neither Free Software nor Open Source Software. This kept it from being included in many of the major OS distributions. Even at that, TrueCrypt has been used by many, and for many reasons, from the innocent to the reprehensible. TrueCrypt was so popular, a crowdfunding campaign raised enough money to fund a professional audit of the TrueCrypt code in 2013.

The story takes an odd turn halfway through the source code audit. Just after the initial audit finished, and just before the in-depth phase II audit began, the TrueCrypt developers suddenly announced that they were ending development. The TrueCrypt website still shows the announcement: "WARNING: Using TrueCrypt is not secure as it may contain unfixed security issues." Many users thought the timing was odd, and speculated that there was a backdoor of some sort that would be uncovered by the audit. The in-depth audit was finished, and while a few minor issues were discovered, nothing particularly serious was uncovered.

One of the more surprising users of TrueCrypt is the German government. It was recently discovered that the BSI, the information security branch of the German government, did an audit on TrueCrypt back in 2010.

Many governments now have laws establishing freedom of information, granting a right to know to their citizens. Under these laws, a citizen may make an official request for documentation, and if such documentation exists, the government is compelled to provide it, barring a few exceptions. A German citizen made an official request for information regarding TrueCrypt, particularly in regards to known backdoors in the software. Surprisingly, such documentation did exist!

Had the German government secretly backdoored TrueCrypt? Were they part of a conspiracy? Probably not. After some red tape and legal wrangling, the text of the audit was finally released and cleared for publication. There were some issues found back in 2010 that were still present in the TrueCrypt/VeraCrypt source, and got fixed as a result of this report coming to light.

The Node Package Manager, that beloved repository of all things JavaScript, recently pushed out an update and announced a pair of vulnerabilities. The vulnerabilities, simply stated, were both due to the lack of any sanity checking when installing packages.

First, the binary install path wasn't sanitized during installation, meaning that a package could attempt to interact with any file on the target filesystem. Particularly when running the NPM CLI as root, the potential for abuse is huge. While this first issue was taken care of with the release of version 6.13.3, a second, similar problem was still present in that release.

Install paths get sanitized in 6.13.3, but the second problem is that a package can install a binary over any other file in its install location. A package can essentially inject code into other installed packages. The fix for this was to only allow a package to overwrite binary files owned by that package.
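
A generic sketch of the kind of guard these fixes amount to, written in Python rather than npm's actual JavaScript: resolve the requested path and refuse anything that escapes the package's own install directory. Names and paths here are illustrative:

import os

def safe_install_path(install_dir: str, requested: str) -> str:
    # Resolve symlinks and ".." components before comparing.
    target = os.path.realpath(os.path.join(install_dir, requested))
    root = os.path.realpath(install_dir)
    # Reject any resolved target that lands outside the install root.
    if os.path.commonpath([root, target]) != root:
        raise ValueError(f"refusing to write outside {root}: {requested}")
    return target

# A benign name resolves inside the package...
print(safe_install_path("/usr/lib/node_modules/pkg", "bin/cli.js"))
# ...while a traversal like "../../../../etc/passwd" raises ValueError.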

The upside here is that a user must install a compromised package in order to be affected. The effect is also greatly mitigated by running NPM as a non-root user, which seems to be good practice.

Google provides a bunch of services around their cloud offering, and provides the very useful web-based Cloud Shell interface for managing those services. A researcher at Offensi spent some time looking for vulnerabilities, and came up with 9 of them. The first step was to identify the running environment, which was a docker image in this case. A socket pointing back to the host system was left exposed, allowing the researcher to easily escape the Docker container. From there, he was able to bootstrap some debugging tools, and get to work finding vulnerabilities.

The vulnerabilities that are detailed are interesting in their own right, but the process of looking for and finding them is the most interesting to me. Google even sponsored a YouTube video detailing the research, embedded below:

Using an iPhone to break the security of a Windows machine? The iPhone driver sets the permissions for a certain file when an iPhone is plugged into the machine. That file could actually be a hardlink to an important system file, and the iPhone driver can unintentionally make that arbitrary file writable.

The Nginx web server is currently being held hostage. Apparently the programmers who originally wrote Nginx were working for a technology company at the time, and now that the Nginx project has been acquired, that company has claimed ownership over the code. It's likely just a fraudulent claim, but the repercussions could be far-reaching if that claim is upheld.

OpenBSD has fixed a simple privilege escalation, where a setuid binary is called with a very odd LD_LIBRARY_PATH: a single dot, and lots of colons. This tricks the loader into loading a user-owned library, but with root privileges.

Read the original here:
This Week In Security: Unicode, Truecrypt, And NPM Vulnerabilities - Hackaday

One of Amazon's first employees says the company should be broken up – Vox.com

Paul Davis literally helped build Amazon.com from scratch. Now he says it's time to tear it apart.

Davis, a computer programmer who was Jeff Bezos's second hire in 1994, before the shopping site even launched, told Recode on Friday that the company should be forced to separate the Amazon Marketplace, which allows outside merchants to sell goods to Amazon customers, from the company's core retail business that stocks and sells products itself.

His reasoning? He's troubled by reports of Amazon squeezing and exploiting the merchants who stock its digital shelves in ways that benefit Amazon, the company, above all else. Davis's concerns come as Bezos's company has come under increased scrutiny from politicians, regulators, and its own sellers, in part over the power it wields over small merchants who depend on the tech giant for their livelihoods.

"There's clearly a public good to have something that functions like the Amazon Marketplace. If this didn't exist, you'd want it to be built," Davis said. "What's not valuable, and what's not good, is that the company that operates the marketplace is also a retailer. They have complete access to every single piece of data and can use that to shape their own retail marketplace."

Davis is referring to how Amazon uses data from its third-party sellers to benefit its core retail business, whether it be by scouring these merchants' best-sellers and then choosing to sell those brands itself, or to create its own branded products through similar means.

"They're not breaking any agreements," he added. "They're just violating what most people would assume was how this is going to work: I sell stuff through your system [and] you're not going to steal our sales."

Davis's comments appear to be one of the first times that an early Amazon employee has called for the company to be broken up. Earlier this year, US presidential candidate Elizabeth Warren argued for the same. And both the US House of Representatives and the Federal Trade Commission are scrutinizing Amazon's business practices to determine if they are anticompetitive, including its dealings with the hundreds of thousands of merchants who are the backbone of Amazon's unmatched product catalogue.

An Amazon spokesperson sent Recode a statement, which read in part: "Sellers are responsible for nearly 60% of sales in our stores. They are incredibly important to us and our customers, and we've invested over $15 billion this year alone, from infrastructure to tools, services, and features, to help them succeed. Amazon only succeeds when sellers succeed and claims to the contrary are wrong. Sellers have full control of their business and make the decisions that are best for them, including the products they choose to sell, pricing, and how they choose to fulfill orders."

Amazon has also previously said that it only uses aggregate seller data versus data from individual sellers to inform its decisions on which products to create under its own brand names.

Davis's comments to Recode came after he posted an online comment alongside a New York Times article earlier this week about the challenges sellers face while doing business on Amazon.

"For nearly 2 decades Amazon has used its control of its marketplace to strengthen its own hand as a retailer," Davis wrote. "This should not be allowed to continue."

The Times article highlighted various ways that Amazon allegedly puts pressure on the merchants who are responsible for nearly 60 percent of all Amazon physical product sales, including burying their listings if they are selling the same product for less elsewhere and making it hard for brands that don't advertise on the site to show up at the top of search results. (Recode spotlighted similar complaints from sellers in an episode of the Land of the Giants podcast series this summer.)

Davis wrote the backend software for the first iterations of the Amazon.com website from 1994 into 1996. He left the company after a year and a half and following the birth of his first child, in part, he said, because of the culture Bezos was creating that churned through good employees, whom Davis says were worked into the ground.

Still today, Davis marvels at what Bezos and his leadership team have built over the past two decades, and he says he shops on Amazon regularly.

"We exist with multiple hats: We're citizens, [we're] employees, we're parents, we're consumers and, from my perspective, if you put the consumer hat on, it's easy to feel incredibly proud of what Amazon is and has become," Davis said. "But the problem is that that's not the only hat that we wear, and it's fine to celebrate and be optimistic and positive about what the company represents for consumers, but you also have to ask seriously, what does the company represent [to us] as citizens, as employees. And unfortunately, you have to be incredibly naive not to see that the answers to those questions are nowhere near as positive."

"It is an amazing story," he added, referring to the company's innovation and success, "but as time goes forward my gut feeling is that it will not only not be the whole story, but really the smallest part of the story." In addition to finding issue with Amazon operating simultaneously as retailer and marketplace, Davis also wonders why such a powerful and, now, profitable company can't pay the frontline workers in its warehouses and delivery network better.

Today, Davis lives in a small New Mexico town and writes open source software for recording and editing audio. He said he knows it's absurd to feel any sort of responsibility for the power that Amazon holds today.

"I doubt there's a single line of code or concept that dates back to when I was there."

He also stressed that most of the company's early success should be attributed to Bezos's intellect, ambition, and drive.

But at times, doubts do creep in for Davis. They emerge when he allows himself to consider what might have been if he, and Amazon's first employee, fellow programmer and first chief technology officer Shel Kaphan, hadn't been the type of technical talents that understood the internet in its earliest days.

"Emotionally," Davis said, "I do feel some kind of culpability."

Here is the original post:
One of Amazon's first employees says the company should be broken up - Vox.com