Linux Mint introduces its own take on the Chromium web browser – ZDNet

Linux Mint is a very popular Linux desktop distribution. I use the latest version, Mint 20, on my production desktops. That's partly because, while it's based on Debian Linux and Ubuntu, it takes its own path. The best example of that is Mint's excellent homebrew desktop interface, Cinnamon. Now, Mint's programmers, led by lead developer Clement "Clem" Lefebvre, have built their own take on Google's open-source Chromium web browser.

Some of you may be saying, "Wait, haven't they offered Chromium for years?" Well, yes and no.

For years, Mint used Ubuntu's Chromium build. But then Canonical, Ubuntu's parent company, moved from releasing Chromium as an APT-compatible DEB package to a Snap.

The Ubuntu Snap software packaging system, along with its rivals Flatpak and AppImage, is a new, container-oriented way of installing Linux applications. The older ways of installing Linux apps, such as the DEB and RPM package management systems for the Debian and Red Hat Linux families, incorporate the source code and hard-coded paths for each program.

While tried and true, these traditional packages are troublesome for developers. They require programmers to hand-craft Linux programs to work with each specific distro and its various releases, and to ensure that each program has access to specific versions of its libraries. That's a lot of painful work, which is why the process earned the name "dependency hell."

Snap avoids this problem by incorporating the application and its libraries into a single package. It's then installed and mounted on a SquashFS virtual file system. When you run a Snap, you're running it inside a secured container of its own.
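
To make the contrast concrete, here is a toy sketch in Python of the version conflict that shared system libraries create and that bundling sidesteps. The package names and versions are invented for illustration; this is not how APT or snapd actually work internally:

```python
# Toy model: with system-wide packages, every app must agree on one
# installed version of a shared library ("dependency hell"). The
# library names and versions below are hypothetical.
system_libs = {"libssl": "1.1"}

apps = {
    "browser": {"libssl": "3.0"},  # wants a newer libssl
    "mail": {"libssl": "1.1"},     # pinned to the older one
}

for app, deps in apps.items():
    for lib, wanted in deps.items():
        have = system_libs.get(lib)
        status = "ok" if have == wanted else f"CONFLICT (have {have}, want {wanted})"
        print(f"{app}: {lib} {wanted} -> {status}")

# Snap-style bundling: each app ships its own copy of its libraries,
# so the two apps can no longer conflict with each other.
for app, deps in apps.items():
    print(f"{app}: bundled {deps} -> ok")
```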

For Chromium in particular, Canonical felt a Snap was the best way to handle the program. As Alan Pope, Canonical's community manager for Ubuntu engineering services, explained:

Maintaining a single release of Chromium is a significant time investment for the Ubuntu Desktop Team working with the Ubuntu Security team to deliver updates to each stable release. As the teams support numerous stable releases of Ubuntu, the amount of work is compounded. Comparing this workload to other Linux distributions which have a single supported rolling release misses the nuance of supporting multiple Long Term Support (LTS) and non-LTS releases.

Google releases a new major version of Chromium every six weeks, with typically several minor versions to address security vulnerabilities in between. Every new stable version has to be built for each supported Ubuntu release (16.04, 18.04, 19.04, and the upcoming 19.10) and for all supported architectures (amd64, i386, arm, arm64).

Additionally, ensuring Chromium even builds (let alone runs) on older releases such as 16.04 can be challenging, as the upstream project often uses new compiler features that are not available on older releases.

In contrast, a Snap needs to be built only once per architecture and will run on all systems that support Snapd. This covers all supported Ubuntu releases including 14.04 with Extended Security Maintenance (ESM), as well as other distributions like Debian, Fedora, Mint, and Manjaro.

That's all well and good, but Lefebvre disliked enormously that:

In the Ubuntu 20.04 package base, the Chromium package is indeed empty and acting, without your consent, as a backdoor by connecting your computer to the Ubuntu Store. Applications in this store cannot be patched or pinned. You can't audit them, hold them, modify them, or even point Snap to a different store. You've as much empowerment with this as if you were using proprietary software, i.e. none. This is in effect similar to a commercial proprietary solution, but with two major differences: It runs as root, and it installs itself without asking you.

So, on June 1, 2020, Mint cut Snap, and the Snap-based Chromium, out of its Linux distro. Now, though, Chromium's back.

Lefebvre wrote, "The Chromium browser is now available in the official repositories for both Linux Mint and LMDE. If you've been waiting for this I'd like to thank you for your patience."

Part of the reason was, well, Canonical was right. Building Chromium from source code is one really slow process. He explained, "To guarantee reactivity and timely updates we had to automate the process of detecting, packaging and compiling new versions of Chromium. This is an application which can require more than 6 hours per build on a fast computer. We allocated a new build server with high specifications (Ryzen 9 3900, 128GB RAM, NVMe) and reduced the time it took to build Chromium to a little more than an hour." That's a lot of power!

Still, for those who love it, up-to-date builds of Chromium are now available for Mint users.

Lefebvre has also started work on an IPTV player. This is a program you can use to watch video streams from streaming services such as Mobdro, Pluto TV, and Locast. Mint already supports such open-source IPTV players as Kodi, but as Lefebvre noted, there's a "lack of good IPTV solutions on the Linux desktop but we're not sure how many people actually do use it." So, Lefebvre has built an alpha prototype, Hypnotix. If there's sufficient interest, there may eventually be an official Mint Hypnotix IPTV player, but that's still a long way off.

Much closer are some speed and compatibility tune-ups to the Cinnamon interface, along with another nice new feature: the ability to add favorites in the Nemo file manager.

So it is that Mint keeps improving, which is one of the big reasons I keep using it year after year.


IBM Delivered An RDi Update, Too – IT Jungle

November 4, 2020 | Alex Woodie

Next week, IBM is planning to deliver the PTFs that include the functionality contained in the Technology Refreshes it just unveiled for IBM i 7.3 and 7.4 last month. You will want to stay on the lookout for those updates. But some of the capabilities that IBM announced in the TRs shipped months ago, including those for Rational Developer for i (RDi).

The latest release of RDi, or version 9.6.0.8, was included as a feature for IBM i 7.3 TR9 and IBM i 7.4 TR4. IBM, of course, announced those TRs on October 6 and says it plans to deliver them on November 13. However, that particular release of RDi was actually delivered back in late April, as you can see from Doug Bidwell's IBM i PTF Guide.

So while RDi 9.6.0.8 isn't exactly new, for the sake of completeness, we are giving it some virtual ink in these pages. Considering the speed at which the IBM i community sometimes adopts new technology, it's quite possible that the capabilities contained in this release of RDi are new, at least to some of you.

RDi 9.6.0.8 isn't a bombshell announcement, but it does include a host of minor new capabilities and updates that were requested by RPG and COBOL developers.

At the top of the list is support for the new RPG language features that IBM introduced with IBM i 7.3 TR9 and IBM i 7.4 TR4, which we told you about last week. This includes the FOR-EACH opcode (as well as the %RANGE and %LIST built-in functions); the DEBUG(*RETVAL) control keyword; the EXPROPTS keyword enhancement; and the new REQPREXP command parameter. These RPG language features are delivered in the compilers, which are included in the Rational Development Studio (5770-WDS) product.
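
For readers who have not yet seen the new opcode, here is a rough analogy in Python (not RPG, whose syntax is not reproduced here) of what FOR-EACH iteration over %LIST and %RANGE amounts to; the semantics shown are an assumption based on the feature names:

```python
# Loose Python analogy of the new RPG iteration features (assumed
# semantics, not RPG syntax): FOR-EACH walks a collection, %LIST
# builds an ad hoc list of values, and %RANGE yields consecutive numbers.

# FOR-EACH over a %LIST-style collection of values:
for city in ["Rochester", "Austin", "Toronto"]:
    print(city)

# FOR-EACH over a %RANGE(1:5)-style series, inclusive of both ends:
for i in range(1, 6):
    print(i)
```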

IBM is also giving users the ability to launch Access Client Solutions (ACS) from RDi without requiring a separate Java runtime environment to be installed. This update is expected to help customers avoid a certain degree of Java runtime aggravation when dealing with RDi and ACS, which has become a critical tool for accessing IBM i functions, including open source software.

This release also brings the ability to open /copy and /include files from ILE RPG source code that's stored on the IFS.

Developers who have run into problems with content assist autoformatting their SQL code will be pleased to hear that SQL is no longer autoformatted. "Formatting now only occurs when the user invokes the format action," IBM says.

IBM has fixed myriad other mostly minor issues with RDi 9.6.0.8, including: the display of whitespace characters that made it hard to read RPGLE source code; incorrect values in the properties view for a referenced field; embedded CRLF sequences in SQL not being handled by the Remote Systems LPEX editor; and problems with editing RPGLE members that reference a copy member with DBCS characters.

You can view the complete list of fixes in RDi 9.6.0.8 at this IBM website.





Breaking the Covenant: Researcher discovers critical flaw in open source C2 framework – The Daily Swig

Tables turned as red teaming tool gets pwned

A security researcher has turned the tables on offensive security tool Covenant by identifying a series of bugs that eventually allowed him to achieve remote code execution (RCE).

Covenant is a .NET-based command and control (C2) framework by Ryan Cobb of SpecterOps that red teamers and pen testers use for offensive cyber operations.

The open source framework features GUI, API, and plugin-driven exploitation options that allow operators to interact with other offensive toolkits.

A security researcher nicknamed Coastal was able to achieve full administrative access to the C2 server via the API exposed by Covenant.

Further work allowed them to achieve remote code execution. Both exploits were possible without any authentication, as laid out in a detailed technical write-up of the attack path.

Exploitation of Covenant was possible because of the exposure of secrets through the "accidental commit of an ephemeral development application settings file," as the researcher explains:

Due to an accidental commit of an ephemeral development application settings file, the secret value of all JWTs [JSON web token] issued by Covenant was locked to a single value across all deployments.

Given the project is open source, the value is known to attackers which allows them to spoof JWTs in order to gain full administrative access of any Covenant server that exposes the management interface on port 7443 (which is the default).

With administrative access, a malicious listener can be configured that, upon implant handshake, results in arbitrary code execution as the Covenant application user (root by default).
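
To make the flaw concrete, here is a minimal sketch using the PyJWT library of why a static signing secret is fatal; the secret string and claim names below are invented for illustration and are not Covenant's actual values:

```python
import jwt  # pip install pyjwt

# Hypothetical stand-in for the secret that was accidentally committed;
# because it was identical across all deployments, every attacker knew it.
LEAKED_SECRET = "static-secret-from-committed-settings-file"

# An attacker signs their own token with the known secret. The claim
# names here are invented; the real token's claims are not reproduced.
forged = jwt.encode(
    {"sub": "attacker", "role": "Administrator"},
    LEAKED_SECRET,
    algorithm="HS256",
)

# Any server that verifies tokens against the same static secret will
# accept this one as genuine, granting full administrative access.
print(forged)
```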

The bug was discovered and disclosed in mid-July with a temporary patch developed the same day. The embedded admin token that was the root of the issue expired on August 3.

Covenant v0.6 was released with a more permanent fix in early August, nearly three months before Coastal went public with his write-up earlier this week.

"The vulnerabilities that were chained were a hard-coded JWT secret that was accidentally committed to source code, which allowed me to spoof administrative rights, paired with an abuse of a legitimate communications system built into the framework in order to get code execution on the server," Coastal told The Daily Swig.

The Daily Swig has contacted Cobb, the developer of the framework, with additional questions. We'll update this story as and when more information comes to hand.



A mobile ingredient-tracing system proposed to fight food fraud – AGDAILY

The public wants to know about their food more than ever, and with a growing middle class globally and production efficiencies easing the costs of many items, more and more people are able to make food decisions based on where their food comes from and other preferences that they choose to embrace. We're seeing this everywhere from farmers markets to traditional grocery stores to big-box retailers. That's why researchers at the University of Tokyo have proposed a prototype app aimed at providing full transparency from farm to table along supply chains.

The hope with this app would be to meet the needs of small farmers, larger-scale growers, and boutique producers.

While official food certification systems exist in many countries, experts say the financial cost of implementation and the labor costs of maintenance are impractical for many smaller-scale farmers. Existing certification systems can also be exploited by unscrupulous sellers who fake certificates or logos of authenticity for premium products, like Japanese Wagyu beef and Italian Parmigiano Reggiano cheese, or for environmentally ethical products, like dolphin-safe tuna.

"Our motivation was to design a food tracking system that is cheap for smallholder farmers, convenient for consumers, and can prevent food fraud," said Kaiyuan Lin, a third-year doctoral student at the University of Tokyo and first author of the research study published in Nature Food.

The research team's food tracking system begins with the harvest of any ingredient, for example, rice on a family farm. The farmer opens the app on a mobile phone, enters details about the amount and type of rice, then creates and prints a QR code to attach to the bags of rice. A truck driver then scans the QR code and enters details into the app about where, when, and how the rice was transported to the market. A market vendor buys the rice, scans the QR code to register that the rice is now in their possession, and enters details about where and how the rice is stored before resale. Eventually, the vendor might sell some rice directly to consumers or other manufacturers who can scan the QR code and know where the rice originated.
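
As a sketch of what the first step might look like in code, assuming the widely used qrcode Python package and invented record fields (the study's actual data format is not described in this article):

```python
import json
import qrcode  # pip install qrcode[pil]

# Hypothetical harvest record; the field names are invented for illustration.
record = {
    "ingredient": "rice",
    "amount_kg": 500,
    "farm": "family-farm-001",
    "harvested": "2020-10-20",
}

# Serialize the record and render it as a QR image the farmer can print
# and attach to the bags; each later handler scans it and appends an entry.
img = qrcode.make(json.dumps(record))
img.save("harvest_qr.png")
```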

"My mission is to make sure the system is not lying to you. Data are recorded in our digital system only when transactions happen person-to-person in the real, physical world, so there can be no fraud," said Lin.

If an imposter registered counterfeit QR codes to dupe consumers, farmers would notice that their alleged harvest size had suddenly doubled in the app. Farmers can also choose to receive updates from the app about where, when, and in what form their harvest eventually reaches consumers.

"We think tracking their ingredients will appeal to farmers' sense of craftsmanship and pride in their work," said Lin.

The app can also turn a long list of ingredients into a single QR code. For example, a factory chef might buy rice from Taiwan, Kampot pepper from Cambodia, and Kobe beef from Japan to manufacture into prepared meal kits. Only when physically receiving these ingredients can the factory record their QR codes. After collecting all of the ingredients' codes, the factory then uses the app to create a new QR code to attach to the completed meal kits. The factory can create a unique QR code for each new batch of meal kits every day. When consumers scan a meal kit's QR code, they can read details about the kit as well as the origins of all the individual ingredients that are digitally connected to the kit's QR code.
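
One plausible way to derive such a composite code, sketched under the assumption that hashing the combined ingredient records yields a compact, tamper-evident batch identifier (the researchers' actual scheme is not detailed in this article):

```python
import hashlib
import json

# Hypothetical codes from the ingredient QR labels already scanned in.
ingredient_codes = ["rice-tw-0001", "pepper-kh-0042", "beef-jp-0007"]

kit = {
    "product": "prepared-meal-kit",
    "batch": "2020-11-04-A",
    "inputs": sorted(ingredient_codes),  # sorted, so order doesn't matter
}

# Hash the combined record to get a short identifier for the new QR code;
# changing any input ingredient changes the batch identifier.
batch_id = hashlib.sha256(json.dumps(kit, sort_keys=True).encode()).hexdigest()[:16]
print(batch_id)
```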

The app was designed with open-source software and a fully decentralized (peer-to-peer or multi-master) database, meaning that changes are not controlled by a centralized server. Data storage is spread out among every user's phone or computer, so there is no central server to hack, providing consumers with even more peace of mind. Researchers hope the decentralized aspect of the app will further contribute to democratizing food systems.
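
The multi-master idea can be sketched with the simplest possible conflict-free data type, a grow-only set. This is a toy illustration of why peers converge without a central server, not the app's actual database design:

```python
# Toy sketch: each phone keeps its own append-only transaction log and
# merges peers' copies whenever they sync. Set union is commutative and
# idempotent, so every peer converges on the same history regardless of
# the order in which syncs happen (a grow-only set, the simplest CRDT).
def merge(local: set, remote: set) -> set:
    return local | remote

phone_a = {("rice-tw-0001", "harvested"), ("rice-tw-0001", "shipped")}
phone_b = {("rice-tw-0001", "harvested"), ("rice-tw-0001", "sold")}

# Both peers end up with the identical merged log.
assert merge(phone_a, phone_b) == merge(phone_b, phone_a)
print(sorted(merge(phone_a, phone_b)))
```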

For now, the app remains a hypothetical proposal in need of further financial support to become a reality.


Free speech hangs in the balance regardless of 2020 election outcome: Parler execs Wernick, Peikoff – Fox Business


When Tim Berners-Lee first brought the World Wide Web to life in 1991, he intended it as a free and open public square spanning the globe, decentralized, a permissionless space in which no authority would dictate what opinions may be expressed, what information may be shared, or who may associate with whom.

This was why he and others insisted that the underlying technology be based on open-source code. "Had the technology been proprietary, and in my total control, it would probably not have taken off," he said. "You can't propose that something be a universal space and at the same time keep control of it."

This ideal was carried into Section 230 of the Communications Decency Act of 1996, which offered substantial protection from legal liability for any company facilitating the sharing, by individuals, of content on the Internet.

Fast-forward 24 years, and Congress has found it necessary to haul before it, time and again, the CEOs of tech companies that have become the Internet's gatekeepers.


How did this happen? We're being told it's because of that very Section 230, which was intended to preserve freedom on the Internet. As a result, many are now calling for it to be modified or repealed.

Computer algorithms that use an individual's personal information to increase his or her engagement have vastly increased the market share of the companies that deploy them.

At the same time, these algorithms have made certain undesirable side effects of online interaction, such as hate speech or misinformation, worse. The companies are then tempted to exercise their prerogative, under Section 230, to remove an ever-increasing scope of content they in good faith deem objectionable, just to clean up their mess.


Ironically and sadly, information sharing today is perhaps more subject to centralized control than it was before the Web was created. So much of our communication, information sharing, and business is now being done online.

This is particularly true during a pandemic when the coffee chats that Berners-Lee used to say were often the most efficient way to share information, pre-Internet, are prohibited.


The dominant online platforms have amassed a vast amount of personal data, an informational panopticon, and they are using this data, along with the latitude afforded by the current interpretation of Section 230, to throttle the flow of information on the Internet and steer the narrative in support of their chosen beliefs.

When called out on this behavior, their approach has been merely to double down, while hiding behind oversight boards and experts.


The situation has come to a head now, with these practices being escalated in order to calm election-related conflict. YouTube, Twitter, and Facebook have all joined in, implementing countless iterations of their content curation policies in the weeks leading up to the election.

These practices, especially of late, seem to work to the benefit of former Vice President Joe Biden. And, if Biden wins, this may facilitate some politicians' goal to finally make social media into a public utility, and Orwell's 1984 into an instruction manual.

But don't be fooled into thinking a win for President Trump would mean a victory for free speech on the Internet.

Statist politicians of both parties would love to seize control of these companies, as well as any company that dares to compete with them.

Most of our competitors collect more data and restrict more speech than any government could, consistent with our Constitution.


Each side would love to seize control in a way that would uniquely benefit their own party, while also being plausibly described as an attempt to enhance freedom.

But unfortunately, what becomes more likely with each Congressional hearing is a broad compromise between the two parties, one that will combine the worst that both parties have to offer: surveillance cronyism with a side of censorship-by-proxy.

We must be vigilant to ensure that doesn't happen, no matter who wins this week.

Just as both sides will gladly spend endless amounts of our money, both are ready to sacrifice our privacy, and our right to free speech, if it means winning the next election.

Instead of a bipartisan compromise that will spawn Big Brother, we should be revisiting Section 230 as it's currently written and interpreting it in a way more consistent with its original intent.

Otherwise, keep government out and let the free market, in the way that only it can, revive its kindred spirits of free thought and free speech.

Jeffrey Wernick is Strategic Investor and Chief Operating Officer for Parler.

Amy Peikoff is Chief Policy Officer for Parler.


The RetroBeat: The Video Game History Foundation is on the hunt for source code – VentureBeat

The video game industry is still young, but we're already in danger of losing parts of its history forever. That's why the Video Game Source Project by the Video Game History Foundation is so important.

This project is a call to arms for the industry to locate and preserve source code, the original art and programming that make up our beloved games. For years, keeping this source code organized was not a priority for many developers and publishers. This can make it difficult to preserve the original versions of these games. It can also hinder historians who could learn about a game's development by looking through those files.

To show just how interesting and important source code is, the Video Game History Foundation is hosting a digital event today with Monkey Island creator Ron Gilbert. A $10 ticket gives you access to the live show and a recording of the event, which will pore over The Secret of Monkey Island's source code, including content cut from the final game. The stream starts at 1 p.m. Pacific today, but since you get a recording of the show, you're not on a deadline.

I talked with the Video Game History Foundation's co-directors, Frank Cifaldi and Kelsey Lewin, about this source code initiative and its exploration of The Secret of Monkey Island.

GamesBeat: Why did you start this Video Game Source Project?

Frank Cifaldi: We believe that there is no better way to study how a game was made than access to its source material. When we say source material, we mean anything that was used in the production of the game. That could mean source code, and it often does, but it also means things like original art that was produced, original sound and music, documentation, correspondence, just anything that survived that gives us more direct, behind-the-scenes access to the game. That's going to give us so much more than the game itself could. And at the same time, this is the kind of material that is very unlikely to be accessible to people. It's something that's held as a trade secret among companies. It's something that, until recent years, was not saved and catalogued to our satisfaction. We've already lost a lot of these source materials.

With the Video Game Source Project, it's twofold. It's an attempt to call out, to the industry and to researchers, the importance of this material to telling stories. It's a callout, but it's also a call to arms and a demonstration.

Kelsey Lewin: And it all goes back to our core mission, which is just that we want to see more video game history in the world. We want to see more books, more documentaries, more research being done. We don't have access to a lot of the stuff that makes that research, writing those books and making those documentaries, easy. Or even doable in a lot of cases. This is a thing that we think would help bring more interesting stories to light.

Cifaldi: This is not unheard-of in other industries. This is just how history books are written, with access to material. You don't write a biography about George Washington unless you read his archived letters. You don't write books about how a film was made unless you have access to things like its script and maybe even storyboards and any other behind-the-scenes materials. It's just not very common that those materials exist in an accessible space for video games. We haven't solved that yet. It could be years away. But the Video Game Source Project is what we consider the first step toward getting to this future that we want to see, where it's normal to be able to access this stuff instead of sort of taboo, or perhaps stolen from Nintendo's servers.

GamesBeat: Where are you most likely to come across source code?

Cifaldi: From the people who wrote it. We have maybe about 100 repositories right now in our archives that were almost entirely sent to us by people who were involved in the development of the game in one way or another, be it single author, or they were part of a team. That tends to be the only place that the source exists, especially for these older games. It's from people who took it home. We don't believe that many video game companies have maintained archives that extensively go back to the 80s and 90s. Especially for games of that vintage, most of them likely only survive in what we call private collections. And so part of the point of going out with this big splash that we're doing with the source project is just to have this awareness campaign for people who have material like this. Hey, maybe this stuff should be somewhere. Maybe you've been holding on to it, waiting for it to make sense to donate it somewhere or give it to someone. We should start that conversation now. We've added quite a few things to our collection because of that.

GamesBeat: Why was archiving such a low priority for these companies in the 80s and 90s?

Lewin: There wasn't really a secondary market yet. We're in an age now where there are lots of remasters and ports and that sort of thing, but back in the 80s, if you made an NES game, you put it out on the NES, and you were done. Move on to the next project. There wasn't an economic reason for companies to do this. They didn't have a reason that would make them money, to save any of this stuff.

Cifaldi: That might sound, I don't know, heartless, but that's how companies operated. Unless there's a financial reason to do something, they're not going to do it. Even if they maybe did archive some of this material back in the day, a lot of it probably wasn't maintained. We know of some companies that are still around that did archive things in the early 90s, but it's kind of stuck on an obscure tape format, and the person who backed it up doesn't work there anymore, and no one knows what software they used to back it up. It's a solvable issue, but for a company, unless they have a commercial product or some other commercial reason to go and solve those issues and read that data, they're not going to do it. It really just comes down to, when we're talking about things that are lost, it's money, but it's also the nature of the industry.

There have been hundreds of video game companies that have gone out of business since the industry's inception. The rights to those games may now have reverted to someone else, but it's pretty unlikely in most cases that code survived and was transferred. Or things like, we know of source code being destroyed in office moves. We're moving offices, and we don't have as much storage space now, and there's this closet full of old stuff that we're just going to toss, because we don't know what it is. There are so many reasons stuff gets lost. It's not unique to video games. We see similar stories throughout film preservation as well. It's just that we didn't think there was a good, organized effort to call all this out and start talking about these problems before now, so the source project is step one for us toward fixing all this.

Above: These floppies contain the source files for an NES game.

Image Credit: Video Game History Foundation

GamesBeat: What kind of a timeline are we under? How long can old media like floppy disks hold out?

Lewin: We've lost stuff already.

Cifaldi: For sure. Nothing that was going to break your heart, but we've had floppies here that are pretty much toast. It depends on the media. We can go down a whole rabbit hole here. I'm a bit worried about magnetic media from the 80s, things like floppy discs. This might sound counterintuitive, but I'm extremely worried about optical backups from the early 2000s. CD-Rs and DVD-Rs. By the time you get to the early 2000s, those things are now mass-produced consumer products, and we've found that the cheap spindles of 100 discs that we used to get in 2003 were cheap, not just to us but also in the manufacturing process. I mean, it's not a significant percentage, but we're starting to see discs of that vintage delaminate, which there's really no recovering from. We're pretty worried about that.

But even if, physically, data survives on some formats, there's also a danger of just knowledge loss, of how to recover these things. We've managed to recover data from obscure formats, from DAT tape backup and stuff like that. But it's only been through having a network of smart people who are interested in this stuff that we've been able to do it. I'm worried about losing that level of expertise, even, when it comes to this stuff, even if it's on a format that could survive. It becomes more and more specialized to recover it. It gets worse as time goes on, to be able to seek out those experts and recover these as those experts age out.

Lewin: What was that compiling program that you found at a company that still owned it, and nobody in the company was able to figure out how to get you this program?

Cifaldi: I don't want to specify who it was, because we did end up finding a copy through other means. But there's one game where we had the source. We had everything but the compiler, the actual program you would run to compile the source code into an executable binary for the target system. We had the raw assets, but we couldn't make the game to play with them. We knew exactly which version number. We had the batch scripts for building the game. The batch script says, run whatever.exe. We didn't have that .exe. We found the person who wrote that compiler in 1994 or something, and he said, no, I didn't keep any of that stuff, but here's the company that owns all of it. I contacted them and talked to their customer support first. They said, oh yeah, we have all of these, we have to maintain all of those, but you have to go through sales to get this compiler. Then I spent about a week back and forth with sales trying to buy a version of their software from 16 years ago. And there's just not a path for that. There's not anything inside of that company that will do that. There just did not seem to be a way to get that old piece of software to build this game through the company. And so we kind of gave up on that, and obtained it through other means. Which was its own strange story. But I don't want to get into exactly how it was obtained.

GamesBeat: It sounds like you have a lot of adventures.

Cifaldi: Yeah, adventures through deceased developers' old hard drives. Adventures through shady Chinese piracy sites, just to get these things running again. That's another thing that degrades as time goes on. That compiler I just talked about, the place we found it was a really weird corner of the internet. And it was the only place we could find it. If that weird corner of the internet goes away, does that program go away forever, and is this game forever unbuildable? It's actually pretty scary.

Above: The Secret of Monkey Island.

Image Credit: LucasArts

GamesBeat: Along with announcing this initiative, you're starting with this big Secret of Monkey Island showcase. Can you talk about how that collaboration came to be?

Cifaldi: We were donated a repository that, among other things, had what seemed to be the complete buildable source for The Secret of Monkey Island and Monkey Island 2: LeChuck's Revenge. I'm a fan of that game, and a historian, so for me this is like, cool, well, the rest of the VGHF shuts down for a week while all I do is go through this. You remember. It was like 12 straight days.

Lewin: Yeah, I decided to just not bother you for a couple of weeks. Ill just do my own thing over here.

Cifaldi: In looking at all the code and figuring out how to build the game and working with the old Lucasfilm tools and things like that, I managed to learn a reasonable amount of SCUMM, the scripting language. I'm like high-school-freshman-fluent in SCUMM at this point. I can read it. I don't know if I can speak it or write it, but I can read it, and mostly understand what I'm seeing. I learned just enough to start doing things like finding areas in the game that aren't used, that aren't even on the disc or anything when you buy it, because it wasn't compiled, and restore them back to their original functionality. I was able to work with our team here to reverse engineer some of their graphic formats, figure out how they work, start spitting out all the frames of information into GIFs and things like that, and find even more older, weirder content.

Essentially, I was in an archaeological dig in what might be my favorite game. I was discovering all kinds of things. I was answering all kinds of mysteries that have been around with this game. Not the secret of Monkey Island. It's not revealed in the source. I'm sorry. But it confirmed a lot of things about the development's history that I doubt even the people who made it remember. I was discovering a lot, and it just seemed like something that we could do something interesting with.

We contacted [Monkey Island creator] Ron Gilbert, first of all, just to make sure that we weren't doing anything that might upset or embarrass him. We're kind of digging through his garbage, you know what I mean? Looking for things he threw away. We talked to him, and then we also talked to Lucasfilm, because we didn't want to hit them over the head with it, that we were publishing content around this game. We're sort of digging through source that was donated to us. And so Lucasfilm was very receptive to what we were doing. Let's face it, Lucasfilm, they're the Star Wars company. They understand that fans like that behind-the-scenes stuff. They understand that it's good for the franchise if fans can talk about it in a more substantial and interesting way. I think they understood. They applied that feeling to Monkey Island and gave us a semi-official blessing to go ahead and publish content. And then Ron is a very transparent person with his development history. He had no opposition to anything we were showing. Nothing in there is strange or embarrassing, I don't think.

We thought, since the anniversary was coming up, we're in it right now. October 1990 is when the game came out. We don't have an actual date. I don't know where the October 15 date that's floating around came from. I can find no source for that. But we know it's October. We asked Ron, would you be willing to do a livestream with us, a ticketed livestream as a fundraiser celebrating the history of this game and looking at this behind-the-scenes content? And he was happy to, so that's what we're doing. We expected to sell maybe 100 tickets. We're at 985 right now [as of October 27]. I don't know if surprised is the right word. Delighted is in there for sure. People obviously are interested in talking about how games are made on a deeper level. The fact that we were able to sell 985 $10 tickets and counting just goes to demonstrate to us that we're on the right track. We're doing the kind of work that is going to open new doors and help us talk about video game history in more interesting ways. Which, by the way, is part of our motivation too. It's sort of the unspoken part. Kelsey and I kind of get bored by traditional video game history narratives. Is that fair?

Lewin: Yeah, I think that's fair. Because there are kind of a finite number of people whom historians have had traditional access to, through interviews and their being very open with their work, plus what's popular, all of those things together, we end up hearing a lot of the same stories over and over again. It's good that these stories are being told. It's not that we shouldn't be telling the story of Pac-Man or ET or whatever. But there are a lot of video games, and a lot of stories to tell.

Cifaldi: It's not like Monkey Island is uncharted territory or anything. But looking at that game through the artifacts that were left behind in the development process is something no one's ever done before. Well, really the reason that we're creating content as part of the source project is that we want to inspire people to think about and investigate video game history a little differently, to start going closer to the source and being archaeologists in that way. It wasn't a big leap of logic for us, that this is what historians would want to study. Sure, you can look at other mediums and compare. But also, when we tend to talk about development history and what people study to look at that, we see two things. People are really fascinated by published screenshots and video of a game before it was done. People obsess over minor details of things like Mario 64's old HUD graphics and the placeholder audio they had. People obsess over those small details from earlier visions.

They also tend to obsess over what's still left in the final game that isn't used. The Cutting Room Floor wiki is an extremely popular website, and all they do is go through shipped games and data-mine them and find things to help paint that development history a little more. For us, well, what if you could get rid of those abstraction layers completely and access those files and see what's going on in places that aren't in the game anywhere? We hope that this is the start of normalizing this. We hope that authors will start throwing their old code on Github or the Internet Archive, just get things out there so that people can start using that as an educational resource, and start understanding development history on a level they haven't before.

GamesBeat: What was one of the cool things you found digging through Monkey Island that you can share with us?

Cifaldi: Maybe not the coolest, but a couple of things come to mind. We already teased a room in the game that isn't in the final product. That one was particularly cool because it's fully fleshed out. It seems finished. It has a really great piece of animation with this severed leg dripping blood. What's fun about it is that there's this news report from 1990 that's been on YouTube for a few years, where they visited Lucasfilm Games, and they actually filmed Ron Gilbert showing off The Secret of Monkey Island while it was still in development, and the one place they show in the game is this room no one had seen before. People were like, what's this room? Where is this? Someone asked Ron, and he didn't even remember it. He had no idea. Things got cut all the time.

Above: The cut room!

Image Credit: Video Game History Foundation

It's one of the first things I looked for, because I knew it was cut content, and I found it. It's in there. This room is not important to the game. It's not this huge grand idea. It's a room connecting two rooms, and in the final game they just connect to each other. This room separates them. That in itself is something that is worth talking about.

I think that a lot of the discussion around cut content in games tends to maybe amplify this level of mystique that was never there. Because game development is so secretive, because we don't tend to get behind-the-scenes access to game development, we tend to think of it as being sort of mythical, when in reality it's a bunch of people collaborating and making something and cutting things out because it doesn't work, or because there's no more room on the disc or whatever.

Lewin: These aren't all weighty decisions that changed the narrative or changed the ideas in the game. Sometimes a cut room is just a cut room.

Cifaldi: Right. If we start acknowledging that decisions are made for reasons, that things get cut for usually the right reasons, then we can move the focus away from these tiny details in the game's development and start talking more about the process and what made the game unique and how the systems talk to each other and how decisions were made, based on our knowledge now of how the game actually works. When I play a SCUMM game now, I feel like I'm in the Matrix. I understand everything that's going on under the hood now. It helps me understand why decisions were made. Why this flame over here isn't animated, why the screen scrolls in this particular way. It gives me this intimate relationship with the game that I couldn't have any other way. I'm pretty thankful for that, and I'm excited for other people to experience that too.

GamesBeat: Where was this cut room?

Cifaldi: It's on Monkey Island. It's basically a connection to the cannibal village. From the overhead map, you would click on the cannibal village, but before it took you to the village, it took you through this path that upped the tension a bit. At this point in the game, all you know is they're cannibals. You don't know that they're goofy cannibals that aren't going to harm you. You're walking through a path and seeing gore and horror and getting scared because you're about to go to a cannibal village and they might kill you. It's just a screen that's there with no purpose other than adding tension, I think. You can't do anything except walk through it. The only interaction is that when you get near the village, you can look at it, and he says something like, I can't see anything from way back here. That's it. It's just a room you walk through to get to the village.

I mean, this is a world that's existed in our heads for decades. It's cool to flesh it out a little more. There are parts in the code that suggest to us that the developers thought fondly of it. It's not something where they're like, ah, kill it. I think it's in the script for the overhead map. The part of the code where you click on the hotspot to go into the cut room, it's still there, but it's commented out, so it's not compiled into the game. Then there's a comment next to it that says, in memory of the unforgettable dripping leg. Something like that. They thought fondly of this room. It could be cut for various reasons. It doesn't really do anything for the game. It just slows it down. That's one reason. But the other reason is that the biggest use of disc space was art, and this was a ton of art. It was not just one giant room. It was also eight frames of leg-dripping animation and four frames of smoke animation. It was a lot of art. It might have been something that was cut for that reason. Incidentally, I have no reason to believe, having investigated this code, that a closeup of the dog was ever a thing in the game. It's on the back of the box. I think it's just a piece of art. I don't think you ever talk to the dog and get a closeup in the game. There's no evidence to support that. I think they restored something that was never there, is my take on that.

The RetroBeat is a weekly column that looks at gaming's past, diving into classics, new retro titles, or looking at how old favorites and their design techniques inspire today's market and experiences. If you have any retro-themed projects or scoops you'd like to send my way, please contact me.


Honeywell fires up the H1, its second-generation quantum computer – CNET

An ion chamber houses the qubit brains of Honeywell's quantum computers.

Honeywell's second-generation quantum computer, the H1, is in business. The powerful computer performs calculations by carefully manipulating 10 ytterbium atoms housed in a thumbnail-size package called an ion trap.

Honeywell, a surprise new entrant into quantum computing, is one of several companies hoping to revolutionize computing. Tech giants IBM, Google, Intel and Microsoft also have serious quantum computing programs, and startups such as Rigetti Computing and IonQ are in the fray with their own machines.


A host of other startups, like QC Ware, Zapata, Cambridge Quantum Computing, Rahko, and Xanadu, are working to make quantum computers easier to use for those who don't have a bunch of Ph.D.s on staff to wrestle with the weird laws that govern the ultra-small scale of the quantum physics realm.

The continued progress is essential if quantum computers, still in their infancy, are to meet their potential. Years of investments will be required to carry today's early designs to a more practical, profitable phase.

The heart of a quantum computer is called a qubit, a data storage and processing element that unlike conventional computer bits can store an overlapping combination of zero and one through one quantum computing phenomenon called superposition. Honeywell's H1 machine today has 10 qubits, charged ytterbium atoms arranged in a line.

Those qubits can be tickled electromagnetically to change the data they're storing, shift positions and reveal their state to the outside world when a calculation is finished. Qubits can be connected through a phenomenon called entanglement that exponentially increases the number of states a quantum computer can evaluate.
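
A small numerical sketch of why entanglement grows the state space exponentially, using NumPy to build the textbook two-qubit Bell state (this is generic quantum mechanics, not Honeywell's control software):

```python
import numpy as np

# n qubits need 2**n complex amplitudes: one qubit is a length-2 vector,
# two qubits a length-4 vector, and the H1's ten qubits span 2**10 = 1024.
zero = np.array([1, 0], dtype=complex)

# Hadamard on qubit 0 followed by CNOT: the standard entangling circuit.
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)
I2 = np.eye(2, dtype=complex)
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=complex)

state = np.kron(zero, zero)             # |00>, four amplitudes
state = CNOT @ (np.kron(H, I2) @ state)
print(state.round(3))                   # (|00> + |11>)/sqrt(2): [0.707 0 0 0.707]
print(2 ** 10)                          # amplitudes spanned by ten qubits
```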

That's why quantum computers promise to be able to crack computing problems that conventional machines can't. One big expected use is molecular modeling to improve chemical processes like fertilizer manufacturing. Quantum computers are also expected to take on other materials science challenges, such as creating efficient solar panels and better batteries. Other uses focus on optimization tasks, like overseeing financial investments and routing fleets of delivery trucks.

Honeywell pioneered this trapped-ion design with the H0 quantum computer prototype. "Because of demand from partners and customers, we transformed H0 into a commercial system," said Tony Uttley, president of Honeywell Quantum Solutions. Customers who've used H0 include Los Alamos National Laboratory and the University of Texas at Austin, oil-and-gas giant BP and financial services company JPMorgan Chase.

The H0 set a record for an IBM-designed quantum computing speed test called quantum volume, a measure that combines the number of qubits and how much useful work they can accomplish. In August, IBM reached a quantum volume of 64, part of a plan to double performance annually. But in October, Honeywell announced its H0 reached a quantum volume of 128. That's part of its plan to increase performance at least by a factor of 10 annually, reaching 640,000 by 2025.
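
For context on those numbers: quantum volume scores are powers of two, so "doubling annually" means climbing one exponent per year, and the 640,000 target is four factor-of-10 steps above 64 (my arithmetic, taking the article's figures at face value):

```python
import math

# IBM's 64 is 2**6 and Honeywell's 128 is 2**7: one doubling apart.
for qv in (64, 128):
    print(f"QV {qv} = 2**{int(math.log2(qv))}")

# Four 10x annual steps from a quantum volume of 64 reach the cited
# 640,000 figure (64 * 10**4).
print(64 * 10**4)
```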

Honeywell also detailed H2, H3, H4 and H5 quantum computer design plans extending through 2030. They'll replace today's straight-line ion trap with increasingly complicated arrangements, including a looped "racetrack" in the H2 already in testing today and increasingly large crisscrossing lattices for the H3, H4 and H5.

One big motivation for the new designs is cramming in more qubits. That'll be important to move beyond today's kicking-the-tires calculations into more serious work. It'll be essential for one of the big challenges for future quantum computers, error correction, which designers hope will let easily perturbed qubits perform calculations for longer before being derailed.


Quantum Computing Market Analysis By Market Size, Share, Revenue Growth, Development And Demand Forecast To 2028 – The Think Curiouser

According to the Canadian Radio-Television and Telecommunications Commission (CRTC), the total revenue generated by the telecom industry in Canada was USD 38.79 billion in 2017.

CRIFAX added a report, "Global Quantum Computing Market, 2020-2028," to its database of market research collaterals, consisting of the overall market scenario with prevalent and future growth prospects, among other growth strategies used by key players to stay ahead of the game. Additionally, recent trends, mergers and acquisitions, and region-wise growth analysis, along with challenges affecting the growth of the market, are also stated in the report.

The increasing number of innovations and advancements in technology globally has provided various business opportunities and is predicted to drive the growth of the market over the forecast period (2019-2028). The introduction of 5G, accompanied by other technologies such as digital reality comprising Augmented Reality (AR), Virtual Reality (VR) and Mixed Reality (MR), or the fast-growing field of quantum computing, is setting new trends for the continuously evolving IT & Telecom industry. The total number of cellular IoT connections is anticipated to reach 3.4 billion by 2023. The global Quantum Computing Market is estimated to attain noticeable growth over the next 6-7 years, owing to digital transformation taking place across several services such as R&D & Testing, Information Technology (IT), Telecom and Internet. Information & Communication Technology (ICT) goods exports recorded a growth of 11.51% in 2017 as against 11.20% in 2016. Through 5G connections, about one billion enhanced mobile broadband subscriptions are anticipated to be covered by 2023.


The global Quantum Computing market is anticipated to observe noteworthy growth in the forthcoming years, owing to increasing investments by the ICT and Telecom industries in research and development activities associated with digital transformation. The United States of America is anticipated to remain the largest telecom market, and Asia Pacific is anticipated to attain the highest market share in the telecom sector. World Development Indicators (WDI) has placed China at the top of the rankings among the various nations according to Purchasing Power Parity (PPP), holding 19.38% of the world's GDP as of 2018. According to the Canadian Radio-Television and Telecommunications Commission (CRTC), the Canadian telecom industry achieved a growth rate of 3.2% from 2016-2017, generating revenues of USD 38.79 billion in 2017, on account of improvement in data usage through both fixed internet and mobile services. Fixed internet services had an average growth rate of 7.0%, attaining revenues of USD 8.87 billion between 2016 and 2017, whereas the mobile segment achieved a growth rate of 5.4% to garner revenues of USD 19.9 billion in 2017. All these factors are anticipated to drive the growth of the market over the forecast period.


To provide a better understanding of internal and external marketing factors, multi-dimensional analytical tools such as SWOT and PESTEL analysis have been implemented in the global Quantum Computing market report. Moreover, the report consists of market segmentation, CAGR (Compound Annual Growth Rate), BPS analysis, Y-o-Y growth (%), Porter's five forces model, absolute $ opportunity and the anticipated cost structure of the market.



Letter: First Amendment and the internet – Courier & Press


OPINION

Evansville Courier & Press | Published 4:25 p.m. CT Oct. 31, 2020

We are concerned with the recent editing of individuals' comments on social media. Amendment 1 to the Constitution of the United States of America reads as follows:

"Congress shall make no law respecting an establishment of religion, or prohibiting the free exercise thereof; or abridging the freedom of speech, or of the press; or the right of the people peaceably to assemble, and to petition the Government for a redress of grievances."

The development of the internet has provided yet another method through which free speech can be communicated and heard. The internet is no different than a newspaper, radio show or television.

Freedom of speech is supported by our laws. Today's internet providers are using their businesses to edit free speech. This happened recently when we published a joke on Facebook; after the joke was posted, Facebook locked my account. The editing of free speech continues on a national scale in politics and business. Content in whatever form should never be edited by firms that maintain virtual monopolies on the internet.

Congress should take immediate steps to restore freedom of speech on the internet.

- Barry Cox





Censorship and loss of First Amendment rights should concern us all – Cumberland Times-News


In reference to an Oct. 28 letter to the editor from Bill Powell concerning the theft of a campaign sign: removing or stealing a sign from someone's property is not only a crime, but also a violation of their free speech rights.

As bad as this is, I would ask Mr. Powell to look beyond this, to today's environment on social media. We used to have news organizations that reported the news, not their ideology. We now have social media: Facebook, Twitter, YouTube. These companies now put a blackout or censorship on any point of view that they do not agree with. And sadly, many take it as the truth.

We should all be concerned about our First Amendment rights being taken away from us, no matter what side of the political fence you may stand on. Wake up, people!

Gerald Davis

LaVale
