Gammy is an adaptive brightness application for Windows and Linux – Ghacks Technology News

You've probably heard of or used applications like F.lux or Lightbulb, which dim the screen to display warmer colors at night. Windows 10 has a night mode that does a similar job.

But what about overly bright applications during the day? They can still be a nuisance, and switching to night mode isn't exactly a good solution. Gammy is an open source application that can help in such scenarios. This portable tool supports adaptive brightness, which makes bright on-screen content easy on the eyes.

Run it and it starts in the system tray. The second you run it, you'll notice that your brightness has been dimmed automatically (if your display's brightness was set to a high level). Double-click the tray icon and an interface pops up. This is an always-on-top window, so you can use different programs and observe how the brightness changes.

The interface has a set of sliders that you use to define the minimum and maximum brightness levels, so the program never dims the screen or raises the brightness too much. The offset percentage feeds into Gammy's brightness calculation; the higher it is, the higher the resulting brightness.

The Temperature setting defines the color temperature, similar to that in F.lux and other screen-dimming applications. It ranges from a maximum of 6500K down to 2000K. Enable the "auto" option to let Gammy adjust it automatically at times that you select. To set the schedule, click the three-dot button next to the option, and you'll be able to set the Start and End times.
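The 6500K-to-2000K range maps to progressively warmer on-screen colors. Gammy's exact conversion isn't documented here, but a common way to turn a Kelvin value into RGB channel values is Tanner Helland's black-body curve-fit approximation. The sketch below (the function name is my own) shows the idea:

```python
import math

def kelvin_to_rgb(kelvin):
    """Approximate RGB (0-255 per channel) for a color temperature in Kelvin.

    Uses Tanner Helland's curve-fit approximation, valid roughly for
    1000K-40000K. Near 6500K the output is close to neutral white;
    lower temperatures shift toward red/orange.
    """
    t = kelvin / 100.0
    # Red channel: saturated below ~6600K, falls off above
    if t <= 66:
        r = 255.0
    else:
        r = 329.698727446 * ((t - 60) ** -0.1332047592)
    # Green channel: logarithmic rise, then gentle fall
    if t <= 66:
        g = 99.4708025861 * math.log(t) - 161.1195681661
    else:
        g = 288.1221695283 * ((t - 60) ** -0.0755148492)
    # Blue channel: zero at very warm temperatures, saturated above ~6600K
    if t >= 66:
        b = 255.0
    elif t <= 19:
        b = 0.0
    else:
        b = 138.5177312231 * math.log(t - 10) - 305.0447927307

    clamp = lambda x: int(max(0, min(255, round(x))))
    return clamp(r), clamp(g), clamp(b)
```

Dividing each channel by 255 gives per-channel multipliers that a tool like this can apply to the display's gamma ramp; at 2000K the blue channel drops sharply, which is what produces the warm night-time tint.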

Note: The Linux version has a padlock button that supports cranking the brightness level all the way up to 200%. The application is otherwise similar to the Windows program.

Uncheck the auto brightness option and another slider appears that lets you adjust the brightness level manually. Click and drag the bottom-right corner of the interface to reveal three additional options: Adaptation speed, Threshold, and Screenshot rate. Adaptation speed determines how fast the brightness changes, while the threshold sets how much the on-screen content must differ before a change is applied. Screenshot rate is explained in the next section.

If you have a smartphone, you may be aware how the automatic brightness works on it. Most handsets these days have a special component called ambient light sensor, which, as the name indicates, detects the amount of light that's available and adjusts the screen brightness accordingly.

Such sensors aren't available on computers, so how does Gammy support adaptive brightness? According to the documentation on the developer's website, Gammy takes a screenshot from time to time and uses its contents (colors) as a reference for adjusting screen brightness. The Screenshot rate, configured in milliseconds, is the interval between two consecutive screenshots.
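In outline, the loop is: grab a frame, estimate how bright its contents are, and steer the brightness toward an inverse of that estimate, bounded by the min/max sliders. The sketch below is my own illustration of that idea, not Gammy's actual formula; the function names, the inverse mapping, and the slider defaults are all assumptions:

```python
def target_brightness(pixels, min_pct=40, max_pct=100, offset_pct=0):
    """Map average frame luminance to a brightness percentage.

    pixels: iterable of (r, g, b) tuples, each channel 0-255.
    Bright content pushes the result toward min_pct, dark content
    toward max_pct; offset_pct shifts the whole curve upward.
    """
    pixels = list(pixels)
    # Rec. 601 luma weights approximate perceived brightness per pixel
    luma = sum(0.299 * r + 0.587 * g + 0.114 * b for r, g, b in pixels) / len(pixels)
    raw = (1.0 - luma / 255.0) * 100.0 + offset_pct  # inverse mapping
    return max(min_pct, min(max_pct, raw))


def step_toward(current, target, adaptation_speed=0.2):
    """One smoothing step, so brightness drifts toward the target
    gradually rather than jumping (the 'Adaptation speed' idea)."""
    return current + adaptation_speed * (target - current)
```

A real implementation would call `target_brightness` on each captured frame (every Screenshot-rate milliseconds), skip the update when the change is below the threshold, and repeat `step_toward` until the display reaches the target.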

Note: The application uses the GPU to take the screenshot, and the interface that it uses is apparently not available in Windows 7. So, technically Gammy is only supported on Windows 8.1 and Windows 10.

Obviously it is not possible to show you the difference in the brightness level in a video recording of the screen. But, here's a demo showing how the program changes the setting, and how quickly it happens.

Try it yourself. If you are working on an application that has a dark background, the brightness level will be reduced automatically. Switch to a program with a lighter backdrop like Notepad, and it will raise the brightness. It works fine even when shifting from one browser tab to another. You may set Gammy to run at startup by right-clicking the tray icon.

Gammy is an open source application. The Windows version requires Visual C++ 2017. The Linux build is a Qt5 app. Instructions for compiling and running the Linux version are available on the download page.

See the original post here:
Gammy is an adaptive brightness application for Windows and Linux - Ghacks Technology News

GitHub’s Plan to Freeze Your Code for Thousands of Years – thenewstack.io

Recently I discovered that some computer code I'd written will outlive me by many centuries. A copy of it has been stored in a chilly cave in the Arctic Circle.

It's part of a fascinating project by GitHub, the 2020 Arctic Vault program, which brings modern technologies into a surprisingly primitive environment to deliver an unexpected honor to a wide swath of the 100 million code repositories currently hosted on GitHub's servers: archiving all this material in perpetuity in an exotic archipelago in Norway, near the northernmost town in the world.

GitHub's vice president of special projects, Thomas Dohmke, tells news.com.au that GitHub is uniquely positioned for the archival, and has the responsibility to protect and preserve the collaborative work of millions of developers around the world. On its webpage for the project, GitHub strikes a similarly grand tone, calling open source software "a hidden cornerstone of modern civilization" and "the shared heritage of all humanity."

"We will protect this priceless knowledge by storing multiple copies, on an ongoing basis, across various data formats and locations," he said.

On a visit, GitHub's CEO Nat Friedman described the storage location, a decommissioned coal mine, as "more mine-y and rustic and raw-hole-in-the-rock than I thought it would be," according to a recent article in Bloomberg. The news service goes on to note that, to Friedman, it's a natural next step. Open source software, in his view, is one of the great achievements of our species, up there with the masterpieces of literature and fine art.

And it's not the only priceless knowledge being stored in this remote location. According to Bloomberg, the other shelves in the mine include Vatican archives, Italian movies, Brazilian land registry records, and the recipe for a certain burger chain's special sauce.

But what's the rationale for this massive effort? The project's page cites the threat of code being abandoned, forgotten, or lost. Worse yet, how would the code otherwise be saved in case of a global catastrophe?

"There exists a range of possible futures in which working modern computers exist, but their software has largely been lost to bit rot. The GitHub Archive Program will include much longer-term media to address the risk of data loss over time," the site notes.

Of course, the code repository service has also given some thought to how the future might use our code. "There is a long history of lost technologies from which the world would have benefited, as well as abandoned technologies which found unexpected new uses," explains the project web page. "It is easy to envision a future in which today's software is seen as a quaint and long-forgotten irrelevancy until an unexpected need for it arises."

Future historians might see the significance in our age of open source ubiquity, volunteer communities, and Moore's Law.

Which code makes the cut? According to GitHub, the archive will include every repo with any commits between the announcement at GitHub Universe on Nov. 13 and 02/02/2020; every repo with at least 1 star and any commits from the year before the snapshot (02/02/2019 to 02/02/2020); and every repo with at least 250 stars. Plus, gh-pages for any repository that meets the aforementioned criteria.
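Those three criteria can be read as a simple predicate. The sketch below is only an illustration of the stated rules (the function name and parameters are mine, and details such as exact timestamps are simplified):

```python
from datetime import date

ANNOUNCEMENT = date(2019, 11, 13)  # GitHub Universe announcement
SNAPSHOT = date(2020, 2, 2)        # 02/02/2020 snapshot date

def in_arctic_snapshot(stars, last_commit):
    """Return True if a repo qualifies for the Arctic Vault snapshot.

    stars: star count at snapshot time.
    last_commit: date of the repo's most recent commit.
    """
    if stars >= 250:
        return True  # popular repos qualify regardless of recent activity
    if stars >= 1 and last_commit >= date(2019, 2, 2):
        return True  # starred, with commits in the year before the snapshot
    # otherwise: any commits between the announcement and the snapshot
    return ANNOUNCEMENT <= last_commit <= SNAPSHOT
```

The practical upshot is that any repository touched between the announcement and the snapshot date got in, stars or not.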

The Norwegian data-storing company Piql, whose custom film and archiving technologies will allow the project to store terabytes of data for over 1,000 years, brags that code is now headed into the gold standard of long-term data storage.

But besides offering vault storage services, Piql also offers a unique form of data digitization. Piql is storing the code on hundreds of reels of film made from polyester and silver halide; Bloomberg points out they're coated with an iron oxide powder for added Armageddon-resistance. Each of its microfilm-like frames holds over 8.8 million pixels. Piql explains that its method involves converting 1s and 0s into QR codes. "No electricity or other human intervention is needed as the climatic conditions in the Arctic are ideal for long-term archival of film," explained a Piql web page.
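A quick back-of-envelope check on those numbers: if each 8.8-megapixel frame stores on the order of one bit per pixel before error-correction overhead (my assumption; Piql's actual density and ECC rate aren't given here), the frame count for a given payload is easy to estimate:

```python
import math

FRAME_PIXELS = 8_800_000  # pixels per microfilm-like frame (from Piql)

def frames_needed(data_bytes, bits_per_pixel=1.0, ecc_overhead=0.3):
    """Estimate film frames required for a payload.

    bits_per_pixel and ecc_overhead are illustrative assumptions:
    dense 2D barcodes approach 1 bit per pixel, and error-correcting
    codes add redundancy so the data survives film degradation.
    """
    payload_bits = data_bytes * 8 * (1 + ecc_overhead)
    return math.ceil(payload_bits / (FRAME_PIXELS * bits_per_pixel))
```

Under these assumptions a gigabyte needs on the order of a thousand frames, so the terabytes in the snapshot translate to millions of frames, which squares with the "hundreds of reels" figure.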

By using a self-contained and technology-independent storage medium, future generations will be able to read back the information, according to Piql. The project also includes instructions on how to unpack and read the code.

Bloomberg even notes that there's a treaty in place which keeps Svalbard neutral in times of war. Because it's all stored on offline film reels, GitHub doesn't have to worry about power outages. An added layer of security comes from its remote location. One GitHub video points out that the Svalbard archipelago is home to the northernmost town in the world as well as thousands of polar bears. The video's description notes that though it's called the GitHub Arctic Code Vault, it's actually closer to the North Pole than to the Arctic Circle.

It's been fun to watch the reactions to GitHub's video. "The future will be amazed by my JavaScript Calculator," joked one comment.

Others couldn't resist commenting on the Arctic location ("Now my code can freeze before it even gets run"). Another naysayer even quipped, "When your code is so bad that you need to bury it under the permafrost..."

GitHub's FAQ says the company plans to re-evaluate the program (and its storage medium) every five years, at which point it'll decide whether to take another snapshot.

And if you're curious what it's like in a Svalbard mine, a nearby coal mine is offering tours. Most of Svalbard's old Norwegian and Russian coal mines have shut down, explains Bloomberg, so locals have rebranded their vast acres of permafrost as an attraction to scientists, doomsday preppers, and scientist doomsday preppers.

Link:
GitHub's Plan to Freeze Your Code for Thousands of Years - thenewstack.io

Chips that pass in the night: How risky is RISC-V to Arm, Intel and the others? Very – The Register

Column How well does Intel sleep? It's just rounded off a record year with a record quarter, turning silicon into greenbacks more efficiently than ever, redeeming recent wobbles in the data centre market and missteps in fabrication with double-digit growth.

The company should be slumbering with all the unworried ease of Scrooge McDuck on a mattress stuffed with thousand-dollar bills. Yet the wake-up packet of unease should be pinging its management port with some insistence right now.

Intel remains a one-trick pony, entirely dependent on the x86 ISA. It has no game in GPUs, it is tuning out of its 5G interests, it has long handed over handsets to Arm. It has memory, it has Wi-Fi, it has wired networking, but compared to the cash cows of edge and central x86, these are barely cash coypu.

One barbarian is at the gates with a refurbished siege engine. AMD has finally got its architectural, process node and marketing acts together and is making up for lost time while Intel is still recalibrating from 10nm disappointment. Yet this is familiar turf for Intel, which remains a very formidable machine with enormous resources and flexibility. When AMD still had its own chip fabs a decade or so ago and was having its own process woes, it suffered: Intel is making record profits. It knows how to sell chips on its own turf. It'll have some bumps getting out of 10nm and the next couple of years may not be quite such record-breakers, but x86 remains its to lose.

The smaller, nimbler and more exciting competitor is going to be harder to defend against in the long term. As it prepares to celebrate the 10th year since its inception, RISC-V is showing the most dangerous trait in any competitor, the ability to redefine the ecosystem.

RISC-V has its conceptual roots in 1980s Berkeley, in part as a direct reaction to the trend towards increasing CPU complexity exemplified by Intel's development of the 8080 via the 8086 into the 80386 during the same epoch. That added instruction set features in silicon as Moore's Law made more transistors affordable; RISC went the other way, keeping the core features small and using Moore's Law to speed them up.

RISC-V, as a collaborative foundation of semiconductor companies, was formed in 2015.

As an architecture, it came into being in 2010, again at Berkeley, in the Parallel Computing Laboratory, funded (oh, the irony) by Microsoft and Intel. It absorbed all the lessons of the previous 30 years, not just architecturally but in how the industry itself worked. The RISC idea saw some success among traditional processor companies, but the big winner was the British upstart Arm: technically clever, with what proved a killer processing-power-per-watt advantage, but which really shone because it was licensed, not made. Manufacturers bought the design, not the chip, and mixed it in with their own circuitry. You couldn't do that with Intel.

RISC-V takes that further. The obvious advantage over Arm is that RISC-V's instruction set architecture is open source; you can use it as you wish without paying royalties. But as with open-source software, the fact that it's free is misleading. You can buy a feature phone with an Arm-based chip in it for a tenner: whatever pennies of that go to CPU licensing don't matter. What RISC-V has that Arm doesn't is extensibility. If you need to add features to the instruction set, go ahead. If you need to tune for very low power or very high throughput, you can.
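The base ISA even reserves opcode space for exactly this: the "custom-0" major opcode (0b0001011) is set aside for vendor instructions, and adding a new R-type instruction is just a matter of packing the standard fields. A minimal sketch of the encoding (the helper name and the example field values are mine):

```python
def encode_rtype(opcode, rd, funct3, rs1, rs2, funct7):
    """Pack a 32-bit RISC-V R-type instruction word.

    Field layout, per the RISC-V base spec:
      funct7[31:25] rs2[24:20] rs1[19:15] funct3[14:12] rd[11:7] opcode[6:0]
    """
    assert 0 <= rd < 32 and 0 <= rs1 < 32 and 0 <= rs2 < 32
    return (funct7 << 25) | (rs2 << 20) | (rs1 << 15) \
         | (funct3 << 12) | (rd << 7) | opcode

CUSTOM_0 = 0b0001011  # major opcode reserved for vendor extensions

# hypothetical custom instruction: result in x1, operands in x2 and x3
word = encode_rtype(CUSTOM_0, rd=1, funct3=0, rs1=2, rs2=3, funct7=0)
```

Because the custom opcode space is guaranteed never to clash with future standard extensions, a vendor can ship such instructions without forking the ISA.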

Even that wouldn't be much of an advantage by itself. Designing architectural innovations in silicon is like architecture in brick; easy enough on paper, but until you build the bugger you can't be sure it won't fall down. The process of checking a design for reliable operation is called verification, and when you have a billion-transistor-class CPU with near-infinite permutations of state you only get to verify as much as you can afford: nowhere close to the whole thing. ARM, Intel, AMD, IBM et al spend a lot of time and money in verification so they can sell a complete design with a plausible "Trust us" stamp on it. If you're building your own RISC-V design and can't afford equivalent verification, how do you trust yourself?

The good news for the RISC-V ecosystem is that verification tools are appearing that automate the process as far as possible. Open source means the majority of your CPU design has been very well tested; your innovations live in an understood and exercised environment, just as open-source software has produced an exceptionally stable yet extensible environment. Conversely, the "Trust us" stamp is looking quite tarnished. Heartbleed, Spectre and the very latest Intel Management Engine vulnerability are all either signs of verification failure or, even worse, problems that came out during verification but were too expensive to fix and too dangerous to admit. That's why buildings fall down.

So, at the same time as the monolithic approach to CPU design is looking the most vulnerable, the RISC-V approach is getting the same momentum as open source software did in the Noughties. It's in supercomputers. It's in IoT. Samsung is making it. The tools are appearing, the people are learning it, it's becoming the right answer to a lot of questions.

To be fair, Intel shouldn't be losing as much sleep over RISC-V as Arm, which now runs the risk of being another of SoftBank's brilliantly timed investments. Yet the openness and expanding ecosystem of RISC-V have the potential, as no other competitor does, to restrict Intel's home-turf advantage, much as Microsoft lost the web and mobile to open-source software based on common architectural ideas.

It doesn't matter how good a dinosaur you are if your environment changes. That's what RISC-V represents.


More here:
Chips that pass in the night: How risky is RISC-V to Arm, Intel and the others? Very - The Register

Unleash the Software Magic with SUSE at the Reach of Your Computer – Tech Times

When one talks about software in a business sense, one does not mean software linked to entertainment or something unproductive; the type of software used for business is exactly what SUSE has to offer. With an ecosystem of partners that work well and on deadline to deliver the perfect enterprise grade, the open, software-defined infrastructure your company needs is in good hands.

(Photo : Screenshot From SUSE Official Facebook Page)SUSE.com

Looking at things on a broader scale, it is not just the product that matters but also the company itself, and one way to tell whether a company is doing well is how it treats its employees. If a company is more family-oriented, it shows that the company cares about the product as much as the profits. Profit-oriented companies, by contrast, treat the product merely as a means toward profit, which sometimes leads them to underperform in their products or services.


When looking for an ideal company, SUSE is definitely one of them. Becoming the world's largest independent open source software company did not come easy, and as the company has grown, so has its workforce, now at 1,600 employees. Every single employee plays a big part in SUSE's success, which is why the company is a great choice for your open source software needs.

When doing business with SUSE, it treats you not as a customer but as a partner. SUSE's goal is to make sure the system works perfectly for you and your company. Announced revenue for FY2020 grew 12 percent year over year, including a 67 percent surge in cloud revenue. Achieving this much as an independent company took a lot of work and research, but SUSE's brilliance is definitely showing.

Modernized, simplified, and accelerated traditional and cloud-native applications are what SUSE has offered to the market, and its IT landscape benefits just about any type of platform. With the company growing in both skill and size, SUSE is celebrating the good returns reported for Q1 of 2020.


When dealing with corporate workloads, the demand for enterprise-grade software increases, as basic software no longer does the job. Building your own software from scratch would take years to perfect, and instead of spending that time on the growth of your company, you might find yourself lost trying to develop enterprise-grade software yourself. Let SUSE do it for you: leave the complicated stuff to the experts while you focus on the things that truly matter, like expanding your company. SUSE is here to help with your software needs, not just as a business but as a partner.


Read more:
Unleash the Software Magic with SUSE at the Reach of Your Computer - Tech Times

Radisys delivers its Engage AI-based media apps on OpenNESS to accelerate 4G and 5G networks innovation – Help Net Security

Radisys, a global leader in open telecom solutions, announced the deployment of the Radisys Engage portfolio of digital engagement and AI-based real-time media applications on Open Network Edge Services Software (OpenNESS), an open source multi-access edge compute (MEC) platform initiative led by Intel to accelerate innovation and unique experiences on 4G/LTE and 5G networks.

The advent of 5G and massive IoT applications requires ultra-low latency, high bandwidth, and real-time access to radio network resources, leading to the rise of multi-access edge computing, which enables virtualized applications to be deployed on compute resources closest to the edge.

However, the lack of broad industry standardization for MEC has led to fragmentation in the development and deployment of MEC platforms, thereby hindering wide-scale adoption.

The OpenNESS platform abstracts complex networking technology and provides microservices/APIs resulting in an easy-to-use toolkit to develop and deploy applications at the network edge.

Radisys Engage advanced real-time media applications are available on the OpenNESS platform, enabling new digital experiences.

"We are pleased to deliver a complete and open MEC platform that comes with ready-to-deploy edge media applications," said Adnan Saleem, CTO, Software and Cloud Solutions, Radisys.

"Through collaboration with Intel, our solution will help service providers to realize the ultra-low latency benefits of 5G, while enabling rich new applications like augmented reality, localized collaboration, improved security, and more."

"Intel is collaborating with the Network Builders ecosystem to deliver open solutions that enable service providers to accelerate innovation while controlling complexity and costs," said Renu Navale, Vice President & General Manager, Edge Computing and Ecosystem Enabling at Intel.

"By adopting and integrating OpenNESS, Intel's open source software for the network edge, Radisys Engage media-centric applications provide the industry with a unique platform for real-time media applications and services."

See the original post here:
Radisys delivers its Engage AI-based media apps on OpenNESS to accelerate 4G and 5G networks innovation - Help Net Security

What Are The Most Common Issues With Free Open Source Software? – Analytics India Magazine

Free and Open Source Software (FOSS) has become a prominent part of the new global economy. Analyses estimate that FOSS makes up about 80-90% of any given piece of today's software. Software is an increasingly critical resource in almost all businesses, both public and private. But there are many issues with FOSS, according to the Linux Foundation.

The Linux Foundation established the Core Infrastructure Initiative (CII) in 2014, through which its members funded and supported FOSS projects that are important to worldwide data and information infrastructure. In 2015, CII finished the Census Project (Census I) to find out which software packages in the Debian Linux distribution had been the most important to the kernel's overall security.

While the Census I project focused on analysing Linux kernel distribution packages, it did not dig into which software was used in production applications. That's where Census II comes in.

In mid-2018, the Linux Foundation partnered with the Laboratory for Innovation Science at Harvard University (LISH) to conduct a second census to discover and analyse the extent to which open-source software is used within applications by private and public companies. Census II thus gives a fuller view of FOSS deployment by analysing usage data provided by the partner Software Composition Analysis (SCA) companies.

The recently published Census II analysis and report from the Linux Foundation sheds light on the process of understanding and solving structural and security complexities in the present-day software supply chain wherever open source is present.

The Linux Foundation's Census II identifies the most commonly used free and open-source software (FOSS) components in production apps and analyses them for potential vulnerabilities, which can inform actions to sustain the long-term security and health of FOSS.

According to Linux Foundation, there is too little data on actual FOSS deployment. Although there is public data on package downloads, software changes, and known security vulnerabilities, the record on where and how FOSS packages are being utilised is unclear.

Members of the Census II team and the Steering Committee spent the months leading up to the project's acquisition of data attempting to anticipate and prepare for expected obstacles and challenges to the data's use and analysis. The challenges created by the lack of a standardised naming schema for software components (which had troubled the Linux Foundation's Census I effort) still persisted. The naming conventions for software components across all the data contributed to the Census II effort were unique, individualised, and inconsistent.

Despite the considerable effort that went into creating the framework to produce these initial results for Census II, the challenge of applying it to other data sets with even more varied formats and naming standards still remains.

The struggles with this lack of a standardised software component naming schema are not unique to the CII Census projects. The National Institute of Standards and Technology (NIST) has grappled with the issue for decades in the context of software vulnerability management.

The bottom line, revealed by the Census II project, the NTIA process, NIST's vulnerability management struggles, and other similar projects, is that there is a critical need for a standardised software component naming schema.

The next challenge, and lesson learned, after the data had been analysed was the criticality of the security of individual developer accounts. Of the top ten most-used software packages in the analysis, the CII team found that seven were hosted under individual developer accounts. The consequences of such heavy reliance on individual programmer accounts must not be ignored. For legal, bureaucratic, and security reasons, individual developer accounts have fewer security safeguards than organisational accounts in the majority of instances.

While these individual accounts can use measures like multi-factor authentication (MFA), they may not always do so, and individual computing environments are probably more vulnerable to attack, the Linux Foundation finds. This means that code changes under such individual developer accounts are far easier to make, and to make without detection. As a result, developer account takeovers have begun occurring with increasing frequency. Backdooring is one popular method used to exploit compromised accounts: attackers insert malicious code into seemingly innocuous packages, creating a backdoor that they can use once the host package is installed.


Follow this link:
What Are The Most Common Issues With Free Open Source Software? - Analytics India Magazine

Open Source Software Market : Up-To-Date Analyses Of Industry Trends And Technological Improvements 2020-2024 – News Times

The new report has been added by alexareports.com to give detailed insight into the worldwide Open Source Software market.

The study will help build a better understanding of the industry's competitors, distribution channels, Open Source Software growth potential, potentially disruptive trends, industry product developments, market size (regional/country level, industry segments), and the market share of top players and products.

Download a sample copy of the 2019-2024 market report study at https://www.alexareports.com/report-sample/482004

Alexa Reports has extensive experience in market research and produces reports offering critical analysis of various markets with quality and precision. Its market analysts use a range of research techniques to offer accurate and dependable information that helps Open Source Software business players effectively plan new growth strategies to strengthen their presence in the market. They also provide SWOT and PESTLE analyses that serve as useful instruments for market participants to assess different scenarios of the concerned market and make further decisions.

The report reviews the competitive landscape among the top Open Source Software players, their company profiles, revenue, sales, and business tactics, and forecasts industry conditions. According to the research, the market is highly competitive and disparate due to global and local vendors. The global Open Source Software market report chiefly covers the following manufacturers:

The key manufacturers operating in the global market include Intel, Epson, IBM, Transcend, Oracle, Acquia, Actuate, Alfresco Software Inc, Sophos, RethinkDB, Canonical, ClearCenter, Cleversafe, and Compiere Inc.

With the slowdown in world economic growth, the Open Source Software industry has also suffered a certain impact, but still maintained relatively optimistic growth. Over the past four years the market grew at an average annual rate of 1.90%, from $3,174 million in 2014 to $3,358 million in 2019; the report's analysts expect the market to expand further, reaching $3,649 million by 2024.

The report covers major players' data, including shipments, revenue, gross profit, interview records, and business distribution, to help readers assess the competition. It also covers all regions and countries of the world, showing regional development status including market size, as well as segment data covering type, industry, and channel segments, plus client information for different industries. Section 1 (definitions) is free; sections 2-3 (major player data) are priced at 1,200 USD.

Market Competition

The competitive landscape of the global Open Source Software market is broadly studied in the report with a large focus on recent developments, future plans of top players, and key growth strategies adopted by them. The analysts authoring the report have profiled almost every major player of the global Open Source Software market and thrown light on their crucial business aspects such as production, areas of operation, and product portfolio. All companies analyzed in the report are studied on the basis of important factors such as market share, market growth, company size, output, sales, and income.

Ask for a discount at https://www.alexareports.com/check-discount/482004

Table of Content

Market by Product: This section carefully analyzes all product segments of the global market: Shareware, Bundled Software, and BSD (Berkeley Source Distribution).

Market by Application: Here, various application segments of the global market are taken into account for the research: Phpbb, BMForum, and Phpwind.

Market Overview: This is the first section of the report that includes an overview of the scope of products offered in the global industry, segments by product and application, and market size.

Market Competition by Player: Here, the report shows how the competition in the global Open Source Software market is growing or decreasing based on deep analysis of market concentrate rate, competitive situations and trends, expansions, merger and acquisition deals, and other subjects. It also shows how different companies are progressing in the global Open Source Software market in terms of revenue, production, sales, and market share.

Company Profiles and Sales Data: This part of the report is very important as it gives statistical as well as other types of analysis of leading manufacturers in the global Open Source Software market. It assesses each and every player studied in the report on the basis of the main business, gross margin, revenue, sales, price, competitors, manufacturing base, product specification, product application, and product category.

Market Forecast: It starts with revenue forecast and then continues with sales, sales growth rate, and revenue growth rate forecasts of the global market. The forecasts are also provided taking into consideration product, application, and regional segments of the global market.

Upstream Raw Materials: This section includes industrial chain analysis, manufacturing cost structure analysis, and key raw materials analysis of the global Open Source Software market.


Marketing Strategy Analysis, Distributors: Here, the research study digs deep into behavior and other factors of downstream customers, distributors, development trends of marketing channels, and marketing channels such as indirect marketing and direct marketing.

Research Findings and Conclusion: This section is solely dedicated to the conclusion and findings of the research study on the global Open Source Software market.

Originally posted here:
Open Source Software Market : Up-To-Date Analyses Of Industry Trends And Technological Improvements 2020-2024 - News Times

SystemInfo is a simple open source system information tool for Windows – Ghacks Technology News

Most of us have used some system information tool at some point or the other to quickly analyze the devices, hardware, and software of a computer system. Tools like HwInfo display information that is useful in various situations. SystemInfo belongs to this genre, and it is open source software.

The program is a portable software, so you can just download the executable and run it directly. The interface looks a lot like Piriform's Speccy, and as a matter of fact even the GitHub page for the program has a Speccy tag. Maybe it was the inspiration behind the application?

Do note that this is not a system monitoring application. If you want one of those, you can try Thilmera7 or Desktop Info, or Conky for Linux. Regardless of that, SystemInfo is quite the useful tool. You can view all of your system's hardware information on a single page. Built a new computer? Bought a new laptop? Run the program to see if everything is as it's supposed to be.

There are no settings whatsoever. It's that simple. SystemInfo lists the BIOS/UEFI name and version number, Operating System information, CPU model and clock frequency, Motherboard model number, total memory and the RAM frequency, GPU, Display, Storage drives (model number and total storage), Optical drives, Network Adapters, Sound cards and the Uptime of the computer.
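SystemInfo itself is a compiled Windows program, so as a rough illustration only, here is how a small subset of those fields could be gathered with Python's standard library. The function name and field labels are my own, not SystemInfo's:

```python
import platform

def basic_system_info() -> dict:
    """Collect a few of the fields a tool like SystemInfo reports,
    using only the standard library. An illustrative sketch, not
    SystemInfo's actual implementation."""
    return {
        "Operating System": f"{platform.system()} {platform.release()}",
        # platform.processor() can be empty on some systems; fall back to the architecture
        "CPU": platform.processor() or platform.machine(),
        "Computer Name": platform.node(),
    }
```

Fuller hardware details (RAM module counts, drive models, and so on) require OS-specific APIs such as WMI on Windows, which is presumably what the real program queries.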

I'm not certain if this is a bug, but there is an issue with the application's memory detection. It detected only one of my memory modules and showed that the laptop has 4GB of RAM. In reality, it has two memory chips, and Windows correctly detects a total of 8 gigs. Aside from this, it works pretty well: it accurately detected that the system doesn't have an optical drive, and that it has an SSD (which I installed using a SATA caddy in place of the DVD drive).

SystemInfo has a built-in screenshot saving option that you can access from the File menu. Snapshots are saved in the PNG format at a location of your choice. The default file name is saved in the following format: sysinfo-capture-YYYY-M-DD_@_HH.MM.SS.

The file name contains the Year, Month, Day, Hour, minute and even the second when the screenshot was saved. You can optionally upload the screenshot to gyazo. The program offers to open the saved file in the default viewer.

If you'd rather have a text-based result, you're in luck. SystemInfo can save the details in HTML, XML or TXT documents. You can also import XML files that you have saved previously.

The Hide IP (Show IP) button can be used to toggle the IP address. This is useful when you're taking a screenshot of the window, or exporting it to a file, and wish to hide your IP address from prying eyes. The program is written in C++.
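As a sketch of how an export-with-masking feature like this can work (the element names and the "IP Address" field are assumptions for illustration, not SystemInfo's actual schema):

```python
import xml.etree.ElementTree as ET

def export_xml(info: dict, hide_ip: bool = True) -> str:
    """Serialize a report dict to XML, optionally masking the IP field,
    mirroring the Hide IP toggle described above. Field names are illustrative."""
    root = ET.Element("systeminfo")
    for key, value in info.items():
        if hide_ip and key == "IP Address":
            value = "hidden"  # mask the address before it reaches the exported file
        ET.SubElement(root, "item", name=key).text = str(value)
    return ET.tostring(root, encoding="unicode")
```

The point of doing the masking at export time, rather than in the display layer, is that the saved file never contains the real address at all.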

The fact that it's portable makes SystemInfo a useful little tool to carry on a USB flash drive. And since it has no options to tinker with, it's suitable for all users.

Software Name: SystemInfo
Operating System: Windows
Software Category: System
Price: Free


Open source companies are thriving in the cloud – ARNnet

Quick, can you spot the common link between MongoDB, DataStax, Redis Labs, Percona, Couchbase, and EnterpriseDB?

If you said, "They're all open source database vendors," you'd be mostly correct. Not all offer databases governed by an open source licence.

But if you said, "Each offers an increasingly popular database-as-a-service cloud offering," you'd be spot on. Indeed, while we've spent a few years with erstwhile open source vendors changing their licenses to ward off evil cloud vendors, what we're starting to see is these same vendors embracing the cloud, and to hugely positive effect.

Hence, while Databricks CEO Ali Ghodsi has correctly argued that it's extremely hard to manage and run a high quality managed service in the cloud, and not all open source companies are good at it, it's also true that more companies are figuring this out, making the next decade the era of open source databases in the cloud.

Signs, signs, everywhere the signs

Already we're seeing clear indicators that open source is leaving behind its on-premises roots and heading to the cloud. A recent Red Hat survey found that 95 per cent of respondents view open source as important, with use of proprietary software declining to 42 per cent (from 55 per cent the year before).

And while it may be too soon to call it a trend, 28 per cent of respondents called out "Designed to work in the cloud" as a key benefit of using modern open source tooling (like Kubernetes), the fourth-most cited benefit (up from eighth place last year).

Meanwhile, as more applications are born in the cloud, cloud databases have been booming. When I first started writing about this in earnest, "cloud database" mostly referred to databases offered by Amazon Web Services (AWS), Microsoft, and Google.

Quite quickly enterprises figured out that rather than having one massive Oracle database to run their diverse workloads, they could instead leverage a broader array of databases, with cloud databases increasingly central to their selections.

So much so, in fact, that in mid-2019 Gartner was ready to declare that cloud is now the default platform for managing data, and that only legacy compatibility or special requirements should keep you on-premises.

This declaration, however, isn't just about databases offered by public cloud vendors. No, an interesting thing has happened to open source vendors on their way to financial success: they've discovered the cloud, and in a big way. Consider MongoDB, for example.

Atlas lifts MongoDB

MongoDB launched Atlas, its fully managed cloud database service, in 2016. A year later, MongoDB reported that Atlas accounted for 10 per cent of its Q4 2017 revenues.

By March 2019, Atlas revenues had surged to 34 per cent of MongoDB's revenues, worth over $100 million in 2018. At that time, MongoDB CEO Dev Ittycheria was asked about the impact cloud database vendors were having on MongoDB.

Ittycheria's response? "We see no impact on a negative basis whatsoever." If anything, he said, it was raising awareness for MongoDB.

And how. In MongoDB's most recent quarter, Atlas revenue boomed by 185 per cent year-over-year, claiming 40 per cent of the company's revenue. In the earnings call, Ittycheria touted MongoDB as a cloud-first company, citing the ways in which focusing on delivering MongoDB as a fully managed cloud service has changed the company.

This calls to mind some advice Couchbase director Andy Oliver recently offered to database competitors who try to innovate open source licensing rather than offer real product innovation: "Only better service, support, and innovation... will save them. Changing the open source definition won't fix what is, in the long-term, a business model problem."

Open source as-a-service

But as MongoDB's results show, creating cloud database services is possible for these current or former open source vendors.

And as difficult as it may be to create competence in operational efficacy, says Ghodsi, it's the only way forward: "The reality is open source software itself has zero intrinsic monetisation value because anyone can use it, so there will always be a requirement for open source vendors to determine the value beyond the software. We believe this value lies in the vendors' ability to deliver open source software as a service."

As results from MongoDB, Redis Labs, DataStax, and others show, database vendors are figuring out how to be as good at operationalising software as they have been at developing software. This should give hope to would-be open source entrepreneurs that worry about how to monetise open source.

Ironically, it turns out that the open source model is the same as it ever was: charge for support. The difference, of course, is that support is baked into the product in a cloud offering.

The database future is firmly planted in the cloud, as Gartner has declared. Fortunately, open source database vendors got the message.




2020 Call for Code Global Challenge Led by IBM Takes On Climate Change on 75th Anniversary of United Nations – The Weather Channel

Call for Code founding partner IBM and creator David Clark Cause, in partnership with United Nations Human Rights and the Linux Foundation, announced this year's Call for Code Global Challenge on Wednesday and invited the world's software developers and innovators to help fight climate change with open source-powered technology.

On its 75th anniversary, the United Nations is demanding a "global reality check" and has launched the biggest-ever global conversation on how to address the world's most pressing issues such as climate change. Heeding the U.N.'s rallying cry to help build the future we want, IBM is joining forces with key U.N. agencies and world leaders to help tackle the climate crisis.

Following two successful years, the 2020 Call for Code Global Challenge encourages and fosters the creation of practical applications built on open source software including Red Hat OpenShift, IBM Cloud, IBM Watson, IBM Blockchain, and data from The Weather Company. The goal is to employ technology in new ways that can make an immediate and lasting humanitarian impact in communities around the world.

A recent global IBM study conducted by Morning Consult surveyed more than 3,000 developers, first responders and social activists across China, Colombia, Egypt, India, Japan, Spain, the United Kingdom, and the United States, and found:

-77% of first responders and developers surveyed agree with the statement "Climate change is the single most pressing issue facing my generation."

-79% of respondents agree that climate change is something that can be reduced or combated with technology.

-87% of respondents feel it is important that a potential employer has taken action on climate change.

-Three quarters of respondents agree that the open source community can help scale climate change solutions to communities in need.

-Eight in 10 respondents agree that most people want to do something to help combat climate change, but don't know where to start.

-Over 180,000 participants from 165 nations took part in Call for Code in 2019; they created more than 5,000 applications focused on natural disaster preparedness and relief.

-This year Call for Code is challenging applicants to create innovations based on open source technologies to help halt and reverse the impact of climate change.

"There is an urgent need to take action against climate change, and IBM is uniquely positioned to connect leading humanitarian experts with the most talented and passionate developers around the world," said Bob Lord, IBM senior vice president of cognitive applications and developer ecosystems. "IBM is determined to identify, deploy, and scale technology solutions that can help save lives, empower people, and create a better world for future generations."

Lord noted that IBM has been mobilizing throughout the company, from policy commitments on climate to IBM's weather forecasting capabilities powered by AI and supercomputers.

Last year's Call for Code Global Challenge winning team, Prometeo, created a wearable device that measures carbon monoxide, smoke concentration, humidity, and temperature to monitor firefighter safety in real time, as well as to help improve their health outcomes in the long term. The solution has been developed further through IBM's Code and Response program and has just completed its first wildfire field test during a controlled burn with the Grups de Reforç d'Actuacions Forestals (GRAF) and the Grup d'Emergències Mèdiques (GEM) dels Bombers de la Generalitat de Catalunya near Barcelona, Spain. Prometeo was developed by a team comprising a veteran firefighter, an emergency medical nurse, and three developers. As recently piloted, the Prometeo hardware-software solution is based on multiple IBM Cloud services.

Other applications like 2018 Call for Code winner Project Owl and 2018 Puerto Rico Call for Code hackathon winner DroneAid have also been cultivated through the Code and Response program.

Visit CallforCode.org to join the community and learn more about the challenge, which will open for submissions on March 22, World Water Day 2020. Additional details, a schedule of in-person and virtual events, and training and enablement for Call for Code will be available at https://developer.ibm.com/callforcode/.
