Infinity is a great open-source option for browsing Reddit on Android – Up News Info

Earlier this year, I wrote about Comet, an excellent third-party Reddit app for iOS. As much as I like Comet, it's an iOS-only affair, and I mostly use Android. Aside from Reddit's own app, there aren't many cross-platform Reddit apps (Slide and BaconReader come to mind). While all these apps are good, one Android Reddit app in particular stands out above the rest: Infinity for Reddit.

What drew me to Infinity in the first place was that it's open source. Over recent months, I've found myself on a bit of an open source kick: open source apps offer a few benefits over closed source apps, and I've found quite a few awesome open source alternatives to apps I use often. One of the main benefits, and one I think should be valuable to everyone, is that open source apps can often be more secure and better respect users' privacy. This isn't a guarantee, but it often holds true thanks to the nature of open source: anyone can look at the code and find any sketchy behaviour.

Granted, the average person likely won't look at the code, let alone understand it, but some people do. Plus, open source apps let other people contribute to the project, which gives the software more of a community feel.

Open source aside, Infinity for Reddit offers several other benefits. For example, one highlighted feature is Lazy Mode, which has the app scroll automatically so users can just relax and read posts as they go by. It's a neat addition, although I didn't find it particularly useful. However, some people out there will definitely be fans and enjoy giving their thumbs a break.

Infinity also supports multiple Reddit accounts, something not all apps offer (and many that do lock the feature behind a premium subscription). On that note, Infinity is completely free to use and doesn't have ads. This alone makes Infinity a great option for many: there's no premium tier hiding some features or capabilities.

Another bonus with Infinity is its fairly robust theming. Out of the box, the app offers a light theme, a dark theme, and support for Android's default theme, which means it can switch between light and dark mode with the rest of your phone. There's a true black AMOLED theme as well, plus several options for custom themes that you can build out and tweak to whatever colour preference you have.

One lesser-known but certainly welcome feature of Infinity is that it easily ports your data over when you switch phones. This includes not just your Reddit account and subreddits but also your app settings and custom themes. It may not be a huge benefit for most users, but it's proved especially useful for me since I'm often switching between devices for work. It's nice to have one less app to fiddle with and set up when booting up a new device.

With all that in mind, I highly recommend Reddit users on Android give Infinity a try. It's a solid app with excellent features, great visuals, customization, and more. Plus, it's completely free to use and, if you're a fan of open source, it checks that box too. Infinity is available for free on the Google Play Store. You can also check out the Infinity subreddit (the dev is quite involved) and view the app's source code on GitHub.


Linux Mint introduces its own take on the Chromium web browser – ZDNet

Linux Mint is a very popular Linux desktop distribution. I use the latest version, Mint 20, on my production desktops. That's partly because, while it's based on Debian Linux and Ubuntu, it takes its own path. The best example of that is Mint's excellent homebrew desktop interface, Cinnamon. Now, Mint's programmers, led by lead developer Clement "Clem" Lefebvre, have built their own take on Google's open-source Chromium web browser.

Some of you may be saying, "Wait, haven't they offered Chromium for years?" Well, yes and no.

For years, Mint used Ubuntu's Chromium build. But then Canonical, Ubuntu's parent company, moved from releasing Chromium as an APT-compatible DEB package to a Snap.

The Ubuntu Snap software packaging system, along with its rivals Flatpak and AppImage, is a new, container-oriented way of installing Linux applications. The older way of installing Linux apps, through package management systems such as DEB and RPM for the Debian and Red Hat Linux families, incorporates the source code and hard-coded paths for each program.

While tried and true, these traditional packages are troublesome for developers. They require programmers to hand-craft Linux programs to work with each specific distro and its various releases. They must ensure that each program has access to specific versions of its libraries. That's a lot of work and painful programming, which led to the process being given a name: dependency hell.

Snap avoids this problem by incorporating the application and its libraries into a single package. It's then installed and mounted on a SquashFS virtual file system. When you run a Snap, you're running it inside a secured container of its own.

For Chromium in particular, Canonical felt using Snaps was the best way to handle the program. As Alan Pope, Canonical's community manager for Ubuntu engineering services, explained:

Maintaining a single release of Chromium is a significant time investment for the Ubuntu Desktop Team working with the Ubuntu Security team to deliver updates to each stable release. As the teams support numerous stable releases of Ubuntu, the amount of work is compounded. Comparing this workload to other Linux distributions which have a single supported rolling release misses the nuance of supporting multiple Long Term Support (LTS) and non-LTS releases.

Google releases a new major version of Chromium every six weeks, with typically several minor versions to address security vulnerabilities in between. Every new stable version has to be built for each supported Ubuntu release (16.04, 18.04, 19.04, and the upcoming 19.10) and for all supported architectures (amd64, i386, arm, arm64).

Additionally, ensuring Chromium even builds (let alone runs) on older releases such as 16.04 can be challenging, as the upstream project often uses new compiler features that are not available on older releases.

In contrast, a Snap needs to be built only once per architecture and will run on all systems that support Snapd. This covers all supported Ubuntu releases including 14.04 with Extended Security Maintenance (ESM), as well as other distributions like Debian, Fedora, Mint, and Manjaro.

That's all well and good, but Lefebvre strongly objected:

In the Ubuntu 20.04 package base, the Chromium package is indeed empty and acting, without your consent, as a backdoor by connecting your computer to the Ubuntu Store. Applications in this store cannot be patched or pinned. You can't audit them, hold them, modify them, or even point Snap to a different store. You've as much empowerment with this as if you were using proprietary software, i.e. none. This is in effect similar to a commercial proprietary solution, but with two major differences: It runs as root, and it installs itself without asking you.

So, on June 1, 2020, Mint cut Snap, and the Snap-based Chromium with it, out of its Linux distro. Now, though, Chromium's back.

Lefebvre wrote, "The Chromium browser is now available in the official repositories for both Linux Mint and LMDE. If you've been waiting for this I'd like to thank you for your patience."

Part of the reason was, well, Canonical was right. Building Chromium from source code is one really slow process. He explained, "To guarantee reactivity and timely updates we had to automate the process of detecting, packaging and compiling new versions of Chromium. This is an application which can require more than 6 hours per build on a fast computer. We allocated a new build server with high specifications (Ryzen 9 3900, 128GB RAM, NVMe) and reduced the time it took to build Chromium to a little more than an hour." That's a lot of power!

Still, for those who love it, up-to-date builds of Chromium are now available for Mint users.

Lefebvre has also started work on an IPTV player. This is a program you can use to watch video streams from streaming services such as Mobdro, Pluto TV, and Locast. Mint already supports such open-source IPTV players as Kodi, but as Lefebvre noted, there's a "lack of good IPTV solutions on the Linux desktop but we're not sure how many people actually do use it." So, Lefebvre has built an alpha prototype, Hypnotix. If there's sufficient interest, there may eventually be an official Mint Hypnotix IPTV player, but that's still a long way off.

Much closer are some speed and compatibility tune-ups to the Cinnamon interface, along with another nice new feature: the ability to add favorites in the Nemo file manager.

So it is that Mint keeps improving, which is one of the big reasons I keep using it year after year.



Breaking the Covenant: Researcher discovers critical flaw in open source C2 framework – The Daily Swig

Tables turned as red teaming tool gets pwned

A security researcher has turned the tables on offensive security tool Covenant by identifying a series of bugs that eventually allowed him to achieve remote code execution (RCE).

Covenant is a .NET-based command and control (C2) framework by Ryan Cobb of SpecterOps that red teamers and pen testers use for offensive cyber operations.

The open source framework features GUI, API, and plugin-driven exploitation options that allow operators to interact with other offensive toolkits.

A security researcher nicknamed Coastal was able to achieve full administrative access to the C2 server via the API exposed by Covenant.

Further work allowed them to achieve remote code execution. Both exploits were possible without any authentication, as laid out in a detailed technical write-up that explains the attack path.

Exploitation of Covenant was possible because of the exposure of secrets through the "accidental commit of an ephemeral development application settings file," as the researcher explains:

Due to an accidental commit of an ephemeral development application settings file, the secret value of all JWTs [JSON web token] issued by Covenant was locked to a single value across all deployments.

Given the project is open source, the value is known to attackers which allows them to spoof JWTs in order to gain full administrative access of any Covenant server that exposes the management interface on port 7443 (which is the default).

With administrative access, a malicious listener can be configured that, upon implant handshake, results in arbitrary code execution as the Covenant application user (root by default).
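To make the class of bug concrete, here is a minimal sketch in Python of how a signing secret leaked through source control lets anyone mint tokens a server will accept. It uses the PyJWT library; the secret value, claim names, and expiry below are hypothetical illustrations, not Covenant's actual internals.

```python
import datetime

import jwt  # PyJWT: pip install pyjwt

# Hypothetical secret recovered from an accidentally committed settings
# file. Because every deployment shipped the same value, learning it once
# lets an attacker sign tokens for any installation.
LEAKED_SECRET = "example-development-secret"


def forge_admin_token(username: str = "admin") -> str:
    """Sign a JWT claiming administrative rights.

    The claim names below are illustrative; a real attack would copy
    whatever claims the target application actually validates.
    """
    payload = {
        "sub": username,
        "role": "Administrator",
        "exp": datetime.datetime.utcnow() + datetime.timedelta(hours=1),
    }
    # With HS256, validation only checks the shared secret, so the server
    # cannot distinguish this token from one it issued itself.
    return jwt.encode(payload, LEAKED_SECRET, algorithm="HS256")


if __name__ == "__main__":
    print(f"Authorization: Bearer {forge_admin_token()}")
```

The usual remedy for this class of flaw is to generate a fresh random secret per installation at setup time, rather than shipping a fixed value in the repository.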

The bug was discovered and disclosed in mid-July with a temporary patch developed the same day. The embedded admin token that was the root of the issue expired on August 3.

Covenant v0.6 was released with a more permanent fix in early August, nearly three months before Coastal went public with his write-up earlier this week.

"The vulnerabilities that were chained were a hard-coded JWT secret that was accidentally committed to source code, which allowed me to spoof administrative rights, paired with an abuse of a legitimate communications system built into the framework in order to get code execution on the server," Coastal told The Daily Swig.

The Daily Swig has contacted Cobb, the developer of the framework, with additional questions. We'll update this story as and when more information comes to hand.



IBM Delivered An RDi Update, Too – IT Jungle

November 4, 2020, by Alex Woodie

Next week, IBM is planning to deliver the PTFs that include the functionality contained in the Technology Refreshes it just unveiled for IBM i 7.3 and 7.4 last month. You will want to stay on the lookout for those updates. But some of the capabilities that IBM announced in the TRs shipped months ago, including those for Rational Developer for i (RDi).

The latest release of RDi, version 9.6.0.8, was included as a feature of IBM i 7.3 TR9 and IBM i 7.4 TR4. IBM, of course, announced those TRs on October 6 and says it plans to deliver them on November 13. However, that particular release of RDi was actually delivered back in late April, as you can see from Doug Bidwell's IBM i PTF Guide.

So while RDi 9.6.0.8 isn't exactly new, for the sake of completeness, we are giving it some virtual ink in these pages. Considering the speed at which the IBM i community sometimes adopts new technology, it's quite possible that the capabilities contained in this release of RDi are new, at least to some of you.

RDi 9.6.0.8 isn't a bombshell announcement, but it does include a host of minor new capabilities and updates that were requested by RPG and COBOL developers.

At the top of the list is support for the new RPG language features that IBM introduced with IBM i 7.3 TR9 and IBM i 7.4 TR4, which we told you about last week. These include the FOR-EACH opcode (as well as the %RANGE and %LIST built-in functions); the DEBUG(*RETVAL) control keyword; the EXPROPTS keyword enhancement; and the new REQPREXP command parameter. These RPG language features are delivered in the compilers, which are included in the Rational Development Studio (5770-WDS) product.

IBM is also giving users the ability to launch Access Client Solutions (ACS) from RDi without requiring a separate Java runtime environment to be installed. This update is expected to help customers avoid a certain degree of Java runtime aggravation when dealing with RDi and ACS, which has become a critical tool for accessing IBM i functions, including open source software.

This release also brings the ability to open /copy and /include files from ILE RPG source code that's stored on the IFS.

Developers who have run into problems with content assist autoformatting their SQL code will be pleased to hear that SQL is no longer autoformatted. "Formatting now only occurs when the user invokes the format action," IBM says.

IBM has fixed myriad other, mostly minor, issues with RDi 9.6.0.8, including: the display of whitespace characters that made RPGLE source code hard to read; incorrect values in the properties view for a referenced field; embedded CRLF sequences in SQL not being handled by the Remote Systems LPEX editor; and problems with editing RPGLE members that reference a copy member with DBCS characters.

You can view the complete list of fixes in RDi 9.6.0.8 at this IBM website.




Free speech hangs in the balance regardless of 2020 election outcome: Parler execs Wernick, Peikoff – Fox Business


When Tim Berners-Lee first brought the World Wide Web to life in 1991, he intended it as a free and open public square spanning the globe: a decentralized, permissionless space in which no authority would dictate what opinions may be expressed, what information may be shared, or who may associate with whom.

This was why he and others insisted that the underlying technology be based on open-source code. "Had the technology been proprietary, and in my total control, it would probably not have taken off," he said. "You can't propose that something be a universal space and at the same time keep control of it."

This ideal was carried into Section 230 of the Communications Decency Act of 1996, which offered substantial protection from legal liability for any company facilitating the sharing, by individuals, of content on the Internet.

Fast-forward 24 years, and Congress has found it necessary to haul before it, time and again, the CEOs of tech companies that have become the Internet's gatekeepers.


How did this happen? We're being told it's because of that very Section 230, which was intended to preserve freedom on the Internet. As a result, many are now calling for it to be modified or repealed.

Computer algorithms that use an individual's personal information to increase his or her engagement have vastly increased the market share of the companies that deploy them.

At the same time, these algorithms have made certain undesirable side effects of online interaction, such as hate speech or misinformation, worse. The companies are then tempted to exercise their prerogative under Section 230 to remove an ever-increasing scope of content they in good faith deem objectionable, just to clean up their mess.


Ironically and sadly, information sharing today is perhaps more subject to centralized control than it was before the Web was created. So much of our communication, information sharing, and business is now done online.

This is particularly true during a pandemic, when the coffee chats that Berners-Lee used to say were often the most efficient way to share information, pre-Internet, are prohibited.


The dominant online platforms have amassed a vast amount of personal data (an informational panopticon), and they are using this data, along with the latitude afforded by the current interpretation of Section 230, to throttle the flow of information on the Internet and steer the narrative in support of their chosen beliefs.

When called out on this behavior, they have merely doubled down, hiding behind oversight boards and experts.


The situation has come to a head now, with these practices escalating in order to calm election-related conflict. YouTube, Twitter, and Facebook have all joined in, implementing countless iterations of their content curation policies in the weeks leading up to the election.

These practices, especially of late, seem to work to the benefit of former Vice President Joe Biden. And, if Biden wins, this may facilitate some politicians' goal of finally making social media into a public utility, and Orwell's 1984 into an instruction manual.

But don't be fooled into thinking a win for President Trump would mean a victory for free speech on the Internet.

Statist politicians of both parties would love to seize control of these companies, as well as any company that dares to compete with them.

Most of our competitors collect more data and restrict more speech than any government could, consistent with our Constitution.


Each side would love to seize control in a way that would uniquely benefit their own party, while also being plausibly described as an attempt to enhance freedom.

But unfortunately, what becomes more likely with each Congressional hearing, is a broad compromise between the two parties, one that will combine the worst that both parties have to offer: surveillance cronyism with a side of censorship-by-proxy.

We must be vigilant to ensure that doesn't happen, no matter who wins this week.

Just as both sides will gladly spend endless amounts of our money, both are ready to sacrifice our privacy, and our right to free speech, if it means winning the next election.

Instead of a bipartisan compromise that will spawn Big Brother, we should be revisiting Section 230 as it's currently written and interpreting it in a way more consistent with its original intent.

Otherwise, keep government out and let the free market, in the way that only it can, revive its kindred spirits of free thought and free speech.

Jeffrey Wernick is Strategic Investor and Chief Operating Officer for Parler.

Amy Peikoff is Chief Policy Officer for Parler.



A mobile ingredient-tracing system proposed to fight food fraud – AGDAILY

The public wants to know about their food more than ever, and with a growing global middle class and production efficiencies easing the costs of many items, more and more people are able to make food decisions based on where their food comes from and other preferences they choose to embrace. We're seeing this everywhere from farmers markets to traditional grocery stores to big-box retailers. That's why researchers at the University of Tokyo have proposed a prototype app aimed at providing full transparency from farm to table along supply chains.

The hope with this app would be to meet the needs of small farmers, larger-scale growers, and boutique producers.

While official food certification systems exist in many countries, experts say the financial cost of implementation and the labor cost of maintenance are impractical for many smaller-scale farmers. Existing certification systems can also be exploited by unscrupulous sellers who fake certificates or logos of authenticity for premium products, like Japanese Wagyu beef and Italian Parmigiano-Reggiano cheese, or for environmentally ethical products, like dolphin-safe tuna.

"Our motivation was to design a food tracking system that is cheap for smallholder farmers, convenient for consumers, and can prevent food fraud," said Kaiyuan Lin, a third-year doctoral student at the University of Tokyo and first author of the research study published in Nature Food.

The research team's food tracking system begins with the harvest of any ingredient, for example, rice on a family farm. The farmer opens the app on a mobile phone, enters details about the amount and type of rice, then creates and prints a QR code to attach to the bags of rice. A truck driver then scans the QR code and enters details into the app about where, when, and how the rice was transported to the market. A market vendor buys the rice, scans the QR code to register that the rice is now in their possession, and enters details about where and how the rice is stored before resale. Eventually, the vendor might sell some rice directly to consumers or other manufacturers, who can scan the QR code and know where the rice originated.

"My mission is to make sure the system is not lying to you. Data are recorded in our digital system only when transactions happen person-to-person in the real, physical world, so there can be no fraud," said Lin.

If an imposter registered counterfeit QR codes to dupe consumers, farmers would notice that their alleged harvest size suddenly duplicated itself in the app. Farmers can also choose to receive updates from the app about where, when, and in what form their harvest eventually reaches consumers.

"We think tracking their ingredients will appeal to farmers' sense of craftsmanship and pride in their work," said Lin.

The app can also turn a long list of ingredients into a single QR code. For example, a factory chef might buy rice from Taiwan, Kampot pepper from Cambodia, and Kobe beef from Japan to manufacture into prepared meal kits. Only when physically receiving these ingredients can the factory record their QR codes. After collecting all of the ingredients' codes, the factory then uses the app to create a new QR code to attach to the completed meal kits. The factory can create a unique QR code for each new batch of meal kits every day. When consumers scan a meal kit's QR code, they can read details about the kit as well as the origins of all the individual ingredients that are digitally connected to the kit's QR code.
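The paper does not spell out the encoding, but one plausible way to implement this kind of chained traceability is to give every record a content-addressed ID and have composite records reference the IDs of their inputs. The Python sketch below illustrates the idea; the field names and hashing scheme are our assumptions, not details of the Tokyo team's app.

```python
import hashlib
import json


def record_id(record: dict) -> str:
    """Derive a content-addressed ID by hashing the record's canonical JSON.

    Tampering with any field changes the ID, so a printed QR code that
    carries the ID no longer matches the altered data.
    """
    canonical = json.dumps(record, sort_keys=True).encode("utf-8")
    return hashlib.sha256(canonical).hexdigest()


# Hypothetical harvest record created by the farmer in the app.
rice = {
    "type": "harvest",
    "product": "rice",
    "origin": "family farm, Taiwan",
    "quantity_kg": 50,
    "harvested": "2020-10-01",
}
rice_id = record_id(rice)  # the printed QR code would encode this ID

# Further hypothetical ingredient records scanned in by the factory.
pepper = {"type": "harvest", "product": "Kampot pepper", "origin": "Cambodia"}
beef = {"type": "harvest", "product": "Kobe beef", "origin": "Japan"}

# The meal kit references its ingredients by ID, so one new QR code
# links transitively back to every origin record.
meal_kit = {
    "type": "meal_kit",
    "batch": "2020-11-04-A",
    "ingredients": [rice_id, record_id(pepper), record_id(beef)],
}
print("meal kit QR payload:", record_id(meal_kit))
```

Because each composite ID depends on the IDs beneath it, a consumer's app could re-verify the whole chain by recomputing each hash and comparing it with the scanned value.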

The app was designed with open-source software and a fully decentralized (peer-to-peer, or multi-master) database, meaning that changes are not controlled by a centralized server. Data storage is spread out among every user's phone or computer, so there is no central server to hack, providing consumers with even more peace of mind. Researchers hope the decentralized aspect of the app will further contribute to democratizing food systems.

For now, the app remains a hypothetical proposal in need of further financial support to become a reality.


The RetroBeat: The Video Game History Foundation is on the hunt for source code – VentureBeat

The video game industry is still young, but we're already in danger of losing parts of its history forever. That's why the Video Game Source Project by the Video Game History Foundation is so important.

This project is a call to arms for the industry to locate and preserve source code, the original art and programming that make up our beloved games. For years, keeping this source code organized was not a priority for many developers and publishers. This can make it difficult to preserve the original versions of these games. It can also hinder historians who could learn about a game's development by looking through those files.

To show just how interesting and important source code is, the Video Game History Foundation is hosting a digital event today with Monkey Island creator Ron Gilbert. A $10 ticket gives you access to the live show and a recording of the event, which will pore over The Secret of Monkey Island's source code, including content cut from the final game. The stream starts at 1 p.m. Pacific today, but since you get a recording of the show, you're not on a deadline.

I talked with the Video Game History Foundation's co-directors, Frank Cifaldi and Kelsey Lewin, about this source code initiative and its exploration of The Secret of Monkey Island.

GamesBeat: Why did you start this Video Game Source Project?

Frank Cifaldi: We believe that there is no better way to study how a game was made than access to its source material. When we say source material, we mean anything that was used in the production of the game. That could mean source code, and it often does, but it also means things like original art that was produced, original sound and music, documentation, correspondence, just anything that survived that gives us more direct, behind-the-scenes access to the game. That's going to give us so much more than the game itself could. And at the same time, this is the kind of material that is very unlikely to be accessible to people. It's something that's held as a trade secret among companies. It's something that, until recent years, was not saved and catalogued to our satisfaction. We've already lost a lot of these source materials.

With the Video Game Source Project, it's twofold. It's an attempt to call out, to the industry and to researchers, the importance of this material to telling stories. It's a callout, but it's also a call to arms and a demonstration.

Kelsey Lewin: And it all goes back to our core mission, which is just that we want to see more video game history in the world. We want to see more books, more documentaries, more research being done. We don't have access to a lot of the stuff that makes that research, writing those books and making those documentaries, easy, or even doable in a lot of cases. This is a thing that we think would help bring more interesting stories to light.

Cifaldi: This is not unheard of in other industries. This is just how history books are written, with access to material. You don't write a biography about George Washington unless you read his archived letters. You don't write books about how a film was made unless you have access to things like its script and maybe even storyboards and any other behind-the-scenes materials. It's just not very common that those materials exist in an accessible space for video games. We haven't solved that yet. It could be years away. But the Video Game Source Project is what we consider the first step toward getting to this future that we want to see, where it's normal to be able to access this stuff instead of sort of taboo, or perhaps stolen from Nintendo's servers.

GamesBeat: Where are you most likely to come across source code?

Cifaldi: From the people who wrote it. We have maybe about 100 repositories right now in our archives that were almost entirely sent to us by people who were involved in the development of the game in one way or another, be it a single author, or they were part of a team. That tends to be the only place that the source exists, especially for these older games. It's from people who took it home. We don't believe that many video game companies have maintained archives that extensively go back to the 80s and 90s. Especially for games of that vintage, most of them likely only survive in what we call private collections. And so part of the point of going out with this big splash that we're doing with the source project is just to have this awareness campaign for people who have material like this. Hey, maybe this stuff should be somewhere. Maybe you've been holding on to it, waiting for it to make sense to donate it somewhere or give it to someone. We should start that conversation now. We've added quite a few things to our collection because of that.

GamesBeat: Why was archiving such a low priority for these companies in the 80s and 90s?

Lewin: There wasn't really a secondary market yet. We're in an age now where there are lots of remasters and ports and that sort of thing, but back in the 80s, if you made an NES game, you put it out on the NES, and you were done. Move on to the next project. There wasn't an economic reason for companies to do this. They didn't have a reason that would make them money to save any of this stuff.

Cifaldi: That might sound, I don't know, heartless, but that's how companies operated. Unless there's a financial reason to do something, they're not going to do it. Even if they maybe did archive some of this material back in the day, a lot of it probably wasn't maintained. We know of some companies that are still around that did archive things in the early 90s, but it's kind of stuck on an obscure tape format, and the person who backed it up doesn't work there anymore, and no one knows what software they used to back it up. It's a solvable issue, but for a company, unless they have a commercial product or some other commercial reason to go and solve those issues and read that data, they're not going to do it. It really just comes down to, when we're talking about things that are lost, it's money, but it's also the nature of the industry.

There have been hundreds of video game companies that have gone out of business since the industry's inception. The rights to those games may now have reverted to someone else, but it's pretty unlikely in most cases that code survived and was transferred. Or things like, we know of source code being destroyed in office moves. We're moving offices, and we don't have as much storage space now, and there's this closet full of old stuff that we're just going to toss, because we don't know what it is. There are so many reasons stuff gets lost. It's not unique to video games. We see similar stories throughout film preservation as well. It's just that we didn't think there was a good, organized effort to call all this out and start talking about these problems before now, so the source project is step one for us toward fixing all this.

Above: These floppies contain the source files for an NES game. (Image credit: Video Game History Foundation)

GamesBeat: What kind of a timeline are we under? How long can old media like floppy disks hold out?

Lewin: We've lost stuff already.

Cifaldi: For sure. Nothing that was going to break your heart, but we've had floppies here that are pretty much toast. It depends on the media. We can go down a whole rabbit hole here. I'm a bit worried about magnetic media from the 80s, things like floppy discs. This might sound counterintuitive, but I'm extremely worried about optical backups from the early 2000s: CD-Rs and DVD-Rs. By the time you get to the early 2000s, those things are now mass-produced consumer products, and we've found that the cheap spindles of 100 discs that we used to get in 2003 were cheap, not just to us but also in the manufacturing process. I mean, it's not a significant percentage, but we're starting to see discs of that vintage delaminate, which there's really no recovering from. We're pretty worried about that.

But even if, physically, data survives on some formats, there's also a danger of just knowledge loss, of how to recover these things. We've managed to recover data from obscure formats, from DAT tape backup and stuff like that. But it's only been through having a network of smart people who are interested in this stuff that we've been able to do it. I'm worried about losing that level of expertise, even, when it comes to this stuff, even if it's on a format that could survive. It becomes more and more specialized to recover it. It gets worse as time goes on, to be able to seek out those experts and recover these as those experts age out.

Lewin: What was that compiling program that you found at a company that still owned it, and nobody in the company was able to figure out how to get you this program?

Cifaldi: I don't want to specify who it was, because we did end up finding a copy through other means. But there's one game where we had the source. We had everything but the compiler, the actual program you would run to compile the source code into an executable binary for the target system. We had the raw assets, but we couldn't make the game to play with them. We knew exactly which version number. We had the batch scripts for building the game. The batch script says, run whatever.exe. We didn't have that .exe. We found the person who wrote that compiler in 1994 or something, and he said, no, I didn't keep any of that stuff, but here's the company that owns all of it. I contacted them and talked to their customer support first. They said, oh yeah, we have all of these, we have to maintain all of those, but you have to go through sales to get this compiler. Then I spent about a week back and forth with sales trying to buy a version of their software from 16 years ago. And there's just not a path for that. There's not anything inside of that company that will do that. There just did not seem to be a way to get that old piece of software to build this game through the company. And so we kind of gave up on that, and obtained it through other means. Which was its own strange story. But I don't want to get into exactly how it was obtained.

GamesBeat: It sounds like you have a lot of adventures.

Cifaldi: Yeah, adventures through deceased developers' old hard drives. Adventures through shady Chinese piracy sites, just to get these things running again. That's another thing that degrades as time goes on. That compiler I just talked about: the place we found it was a really weird corner of the internet. And it was the only place we could find it. If that weird corner of the internet goes away, does that program go away forever, and is this game forever unbuildable? It's actually pretty scary.

Above: The Secret of Monkey Island. (Image credit: LucasArts)

GamesBeat: Along with announcing this initiative, you're starting with this big Secret of Monkey Island showcase. Can you talk about how that collaboration came to be?

Cifaldi: We were donated a repository that, among other things, had what seemed to be the complete buildable source for The Secret of Monkey Island and Monkey Island 2: LeChuck's Revenge. I'm a fan of that game, and a historian, so for me this is like, cool, well, the rest of the VGHF shuts down for a week while all I do is go through this. You remember. It was like 12 straight days.

Lewin: Yeah, I decided to just not bother you for a couple of weeks. I'll just do my own thing over here.

Cifaldi: In looking at all the code and figuring out how to build the game and working with the old Lucasfilm tools and things like that, I managed to learn a reasonable amount of SCUMM, the scripting language. I'm like high-school-freshman-fluent in SCUMM at this point. I can read it. I don't know if I can speak it or write it, but I can read it, and mostly understand what I'm seeing. I learned just enough to start doing things like finding areas in the game that aren't used, that aren't even on the disc or anything when you buy it, because they weren't compiled, and restoring them back to their original functionality. I was able to work with our team here to reverse engineer some of their graphic formats, figure out how they work, start spitting out all the frames of animation into GIFs and things like that, and find even more older, weirder content.

Essentially, I was in an archaeological dig in what might be my favorite game. I was discovering all kinds of things. I was answering all kinds of mysteries that have been around with this game. Not the secret of Monkey Island. It's not revealed in the source. I'm sorry. But it confirmed a lot of things about the development history that I doubt even the people who made it remember. I was discovering a lot, and it just seemed like something that we could do something interesting with.

We contacted [Monkey Island creator] Ron Gilbert, first of all, just to make sure that we weren't doing anything that might upset or embarrass him. We're kind of digging through his garbage, you know what I mean? Looking for things he threw away. We talked to him, and then we also talked to Lucasfilm, because we didn't want to hit them over the head with it, that we were publishing content around this game. We're sort of digging through source that was donated to us. And so Lucasfilm was very receptive to what we were doing. Let's face it, Lucasfilm, they're the Star Wars company. They understand that fans like that behind-the-scenes stuff. They understand that it's good for the franchise if fans can talk about it in a more substantial and interesting way. I think they understood. They applied that feeling to Monkey Island and gave us a semi-official blessing to go ahead and publish content. And then Ron is a very transparent person with his development history. He had no opposition to anything we were showing. Nothing in there is strange or embarrassing, I don't think.

We thought about the anniversary, which was coming up; we're in it right now. October 1990 is when the game came out. We don't have an actual date. I don't know where the October 15 date that's floating around came from. I can find no source for that. But we know it's October. So we asked Ron, would you be willing to do a livestream with us, a ticketed livestream as a fundraiser celebrating the history of this game and looking at this behind-the-scenes content? And he was happy to, so that's what we're doing. We expected to sell maybe 100 tickets. We're at 985 right now [as of October 27]. I don't know if surprised is the right word. Delighted is in there for sure. People obviously are interested in talking about how games are made on a deeper level. The fact that we were able to sell 985 $10 tickets and counting just goes to demonstrate to us that we're on the right track. We're doing the kind of work that is going to open new doors and help us talk about video game history in more interesting ways. Which, by the way, is part of our motivation too. It's sort of the unspoken part. Kelsey and I kind of get bored by traditional video game history narratives. Is that fair?

Lewin: Yeah, I think that's fair. Because there are only so many people who historians have had traditional access to, people doing interviews and being very open with their work, and because of what's popular, all of those things together mean we end up hearing a lot of the same stories over and over again. It's good that these stories are being told. It's not that we shouldn't be telling the story of Pac-Man or E.T. or whatever. But there are a lot of video games, and a lot of stories to tell.

Cifaldi: It's not like Monkey Island is uncharted territory or anything. But looking at that game through the artifacts that were left behind in the development process is something no one's ever done before. Really, the reason that we're creating content as part of the source project is that we want to inspire people to think about and investigate video game history a little differently, to start going closer to the source and being archaeologists in that way. It wasn't a big leap of logic for us that this is what historians would want to study. Sure, you can look at other mediums and compare. But also, when we tend to talk about development history and what people study to look at that, we see two things. People are really fascinated by published screenshots and video of a game before it was done. People obsess over minor details of things like Mario 64's old HUD graphics and the placeholder audio it had. People obsess over those small details from earlier visions.

They also tend to obsess over what's still left in the final game that isn't used. The Cutting Room Floor wiki is an extremely popular website, and all they do is go through shipped games and data-mine them and find things to help paint that development history a little more. For us, well, what if you could get rid of those abstraction layers completely and access those files and see what's going on in places that aren't in the game anywhere? We hope that this is the start of normalizing this. We hope that authors will start throwing their old code on GitHub or the Internet Archive, just getting things out there so that people can start using that as an educational resource, and start understanding development history on a level they haven't before.

GamesBeat: What was one of the cool things you found digging through Monkey Island that you can share with us?

Cifaldi: Maybe not the coolest, but a couple of things come to mind. We already teased a room in the game that isn't in the final product. That one was particularly cool because it's fully fleshed out. It seems finished. It has a really great piece of animation with this severed leg dripping blood. What's fun about it is that there's this news report from 1990 that's been on YouTube for a few years, where they visited Lucasfilm Games, and they actually filmed Ron Gilbert showing off The Secret of Monkey Island while it was still in development, and the one place they show in the game is this room no one had seen before. People were like, what's this room? Where is this? Someone asked Ron, and he didn't even remember it. He had no idea. Things got cut all the time.

Above: The cut room! (Image credit: Video Game History Foundation)

It's one of the first things I looked for, because I knew it was cut content, and I found it. It's in there. This room is not important to the game. It's not this huge grand idea. It's a room connecting two rooms, and in the final game they just connect to each other. This room separates them. That in itself is something that is worth talking about.

I think that a lot of the discussion around cut content in games tends to amplify a level of mystique that was never there. Because game development is so secretive, because we don't tend to get behind-the-scenes access to game development, we tend to think of it as being sort of mythical, when in reality it's a bunch of people collaborating and making something and cutting things out because they don't work, or because there's no more room on the disc or whatever.

Lewin: These aren't all weighty decisions that changed the narrative or changed the ideas in the game. Sometimes a cut room is just a cut room.

Cifaldi: Right. If we start acknowledging that decisions are made for reasons, that things get cut for usually the right reasons, then we can move focus away from these tiny details in a game's development and start talking more about the process, what made the game unique, how the systems talk to each other, and how decisions were made, based on our knowledge now of how the game actually works. When I play a SCUMM game now, I feel like I'm in the Matrix. I understand everything that's going on under the hood now. It helps me understand why decisions were made. Why this flame over here isn't animated, why the screen scrolls in this particular way. It gives me this intimate relationship with the game that I couldn't have any other way. I'm pretty thankful for that, and I'm excited for other people to experience that too.

GamesBeat: Where was this cut room?

Cifaldi: It's on Monkey Island. It's basically a connection to the cannibal village. From the overhead map, you would click on the cannibal village, but before it took you to the village, it took you through this path that upped the tension a bit. At this point in the game, all you know is they're cannibals. You don't know that they're goofy cannibals that aren't going to harm you. You're walking through a path and seeing gore and horror and getting scared because you're about to go to a cannibal village and they might kill you. It's just a screen that's there with no purpose other than adding tension, I think. You can't do anything except walk through it. The only interaction is that when you get near the village, you can look at it, and he says something like, I can't see anything from way back here. That's it. It's just a room you walk through to get to the village.

I mean, this is a world that's existed in our heads for decades. It's cool to flesh it out a little more. There are parts in the code that suggest to us that the developers thought fondly of it. It's not something where they're like, ah, kill it. I think it's in the script for the overhead map. The part of the code where you click on the hotspot to go into the cut room, it's still there, but it's commented out, so it's not compiled into the game. Then there's a comment next to it that says, in memory of the unforgettable dripping leg. Something like that. They thought fondly of this room. It could have been cut for various reasons. It doesn't really do anything for the game. It just slows it down. That's one reason. But the other reason is that the biggest use of disc space was art, and this was a ton of art. It was not just one giant room. It was also eight frames of leg-dripping animation and four frames of smoke animation. It was a lot of art. It might have been something that was cut for that reason. Incidentally, I have no reason to believe, having investigated this code, that a closeup of the dog was ever a thing in the game. It's on the back of the box. I think it's just a piece of art. I don't think you ever talk to the dog and get a closeup in the game. There's no evidence to support that. I think they restored something that was never there, is my take on that.

The RetroBeat is a weekly column that looks at gaming's past, diving into classics, new retro titles, or looking at how old favorites and their design techniques inspire today's market and experiences. If you have any retro-themed projects or scoops you'd like to send my way, please contact me.


How Safe Is the US Election from Hacking? – The New York Review of Books

A voter casting a ballot on an electronic device in early voting, Los Angeles, California, October 29, 2020 (Patrick T. Fallon/Bloomberg via Getty Images)

In September, The New York Times reported on a concerning surge in Russian ransomware attacks against the United States, including against "small towns, big cities and the contractors who run their voting systems," the full scale of which is not always disclosed. Last week, the newspaper further reported that Russia has "in recent days hacked into state and local computer networks in breaches that could allow Moscow broader access to American voting infrastructure," but said that Russia's ability to change vote tallies nationwide is limited, a caveat that seems more ominous than reassuring. Meanwhile, public officials and voting-machine vendors historically have not always been forthcoming with the public about the extent of security weaknesses and breaches. Election security advocates worry that this lack of transparency may leave the public exposed both to potential election theft and to false claims that election theft has occurred. In an effort to mitigate these risks, grassroots efforts around the country seek to make the 2020 election more transparent than past elections.

In August 2016, according to David Shimer's book Rigged, the U.S. intelligence community had reported that Russian hackers could "edit actual vote tallies," according to four of Obama's senior advisors. But the only government official who publicly alluded to this possibility was then Senate minority leader Harry Reid. On August 29, 2016, Reid published a letter he'd sent to then FBI director James Comey in which he said the threat of Russian interference "is more extensive than is widely known and may include the intent to falsify official election results."

Reid has said that he believes vote tallies were changed in 2016. According to Rigged, Obama's leading advisors dismissed Reid's theory, with a catch: they could not rule it out. James Clapper, Obama's director of national intelligence, told Shimer: "We saw no evidence of interference in voter tallying, not to say that there wasn't, we just didn't see any evidence."

According to Rigged, the Department of Homeland Security (DHS) did not have independent surveillance abilities, and just thirty-six local election offices had let it assess the security of their voting systems before the 2016 election. In January 2017, the DHS confirmed that it had conducted no forensic analysis to verify that vote tallies weren't altered. In June 2017, it again confirmed that it had conducted no such forensic analysis and did not intend to do so. Senator Ron Wyden, Democrat of Oregon, has since said: "As far as I can tell, no systematic post-election forensic examination of these voting machines took place. Whatever the reason for this failure to act, this administration cannot afford to repeat the mistakes of 2016."

Also in June 2017, The Intercept reported that Russia had attacked our election infrastructure and that the attack was more pervasive than either the Obama or Trump administrations had let on, based on a classified report leaked to the publication by Reality Winner, a twenty-eight-year-old Air Force veteran and National Security Agency contractor. Unfortunately for her, The Intercept published the document in such a way that the FBI was able to identify the source of the leak; Winner was arrested and tried under the Espionage Act. Sentenced to five years in prison, she is still serving her term.

In his duty to report a threat to the republic, the FBI's director was infinitely less forthcoming than Winner. In September 2016, James Comey testified to Congress that the vote system in the United States "is very, very hard to hack into" because "[t]hese things are not connected to the Internet." The same month, the former Elections Initiatives director for Pew Charitable Trusts told Congress: "I know of no jurisdiction where voting machines are connected to the Internet. This makes it nearly impossible for a remote hacker." Numerous other individuals, including Thomas Hicks, who has served as chairman of the US Election Assistance Commission (EAC) since 2014, have also told Congress that voting machines are not connected to the Internet.

Such reassurances were deeply misleading.

Before each election, all voting machines must be programmed with new ballots. They typically receive this programming via removable memory cards from county election management systems or computers outsourced to third parties. According to election security expert J. Alex Halderman and others, most election management systems can and likely do connect to the Internet from time to time, or receive data from other, Internet-connected systems. In Halderman's view, according to the tech news site Cyberscoop, a determined attacker could spearphish the individuals responsible for programming the ballots and infect their devices with [vote-changing] malware that could spread via the memory cards to all of the voting machines in a county or state; and there's little visibility into how officials or third parties manage the ballot programming process and whether they use cybersecurity best practices.

Furthermore, in 2015 Wisconsin and Florida approved the installation of cellular modems in their Election Systems & Software (ES&S) precinct ballot scanners, which are used to count paper ballots (whether marked by hand or with a touchscreen). Poll workers use these modems to transfer unofficial vote totals from the precincts to the county election management systems (which include the county central tabulators) on election night. Official results are typically still transferred from precincts using memory cards or other removable media that are physically transported to the counties, a so-called sneakernet. Election security experts strongly advise against using cellular modems to transfer unofficial results because, they say, this practice provides an unnecessary opening for foreign nations and other remote attackers to infiltrate counties' central tabulation systems. After breaching such tabulators, a hacker could install malware to change not only the unofficial vote tallies but also the official ones.

Federal guidelines for voting equipment are voluntary and do not currently bar the use of cellular modems. The National Institute of Standards and Technology (NIST) assists the EAC in developing these guidelines, and the agency is working on the next generation of them. According to a recent report in the Palm Beach Post, a NIST official cautioned the EAC in December 2019 that the use of wireless devices makes the voting system a node on the internet that could provide an entryway for remote attackers.

Vendors and many election officials have ignored such warnings, sometimes claiming (falsely) that cellular transmissions don't connect to the Internet. Other times, they claim that the connection is so brief that it doesn't matter. But experts say that exposing an election system to the Internet even briefly on election night provides enough time for a determined attacker lying in wait to enter the system.

Last year, cybersecurity journalist Kim Zetter reported that a team of election security experts led by Kevin Skoglund had discovered that some election systems on the receiving end of these modem transmissions had been left online for months and perhaps years, not just a few seconds. These include systems in Florida (seven counties, including Miami-Dade), Michigan (four counties), and Wisconsin (nine counties).

In September this year, these same states received a letter signed by nearly thirty election security experts and election integrity organizations, recommending that election officials remove these modems. Susan Greenhalgh, the senior election security adviser for the nonprofit Free Speech for People, who led the initiative, told me that these swing state officials have not responded to the letters. Ion Sancho, who served as the supervisor of elections for Leon County, Florida, for almost thirty years and appears in the documentary films Hacking Democracy and Kill Chain, recently wrote follow-up letters to Florida county election officials in a final attempt to persuade them not to use the modems and to disconnect central servers from the Internet.

Justin Sullivan/Getty Images

Voters waiting in line at a polling station, Lawrenceville, Georgia, October 30, 2020

*

Of course, it's not just foreign powers that we must worry about. As cybersecurity journalist Brad Friedman told me, an election commission headed by President Jimmy Carter found, after the controversy surrounding the secret tabulation of the 2004 election in Ohio, that election insiders remain the greatest threat to our elections. Election management systems, voting machines, memory cards, and USB sticks are among the many things that election insiders could corrupt. The software used in voting machines and election management systems is proprietary to the vendors, making it difficult to obtain permission to forensically analyze them. Experts say hackers could erase their tracks anyway. As a practical matter, the only way to know whether electronic vote tallies are legitimate is to conduct full manual recounts or robust manual audits using a reliable paper trail. But most jurisdictions require manual recounts, if at all, only if the margin of victory is less than 0.5 percent. Thus, after the 2016 election, many experts and advocacy groups recommended legislation requiring robust manual election audits in 2020.

Earlier this year, though, Republicans blocked federal legislation, the SAFE Act, which would have required such audits for most federal races. America's preeminent election-auditing expert, Philip Stark, a professor of statistics at the University of California at Berkeley, told me a few weeks ago that "only a few jurisdictions currently audit elections in a way that has a good chance of catching and correcting wrong reported outcomes. That requires a trustworthy paper trail, primarily hand-marked paper ballots, kept demonstrably secure throughout the election and the audit, and [what is known as] a risk-limiting audit using that paper trail. But, to the best of my knowledge, even those states only audit a few contests in each election." (Emphasis added.) A report by the National Conference of State Legislatures confirms that just three states (Colorado, Rhode Island, and Virginia) require risk-limiting audits for one or more races.
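To make concrete what such an audit computes, here is a minimal sketch, in Python, of the ballot-polling method behind many risk-limiting audits: the BRAVO test developed by Mark Lindeman and Philip Stark. It is illustrative only, assuming a simplified two-candidate contest and a 5 percent risk limit; real audits handle multiple contests, sampling logistics, and escalation rules that this toy version omits.

    import random

    def bravo_audit(winner_share, paper_trail, risk_limit=0.05, max_draws=3000):
        # winner_share: the winner's reported share of the two-candidate
        # vote (must exceed 0.5). paper_trail: list of 'W'/'L' votes as
        # read from the paper records. Returns (confirmed, draws).
        assert winner_share > 0.5
        t = 1.0                       # sequential likelihood-ratio statistic
        threshold = 1.0 / risk_limit  # stop when t reaches 1/alpha
        for draws in range(1, max_draws + 1):
            ballot = random.choice(paper_trail)  # draw with replacement
            if ballot == 'W':
                t *= winner_share / 0.5
            else:
                t *= (1.0 - winner_share) / 0.5
            if t >= threshold:
                return True, draws    # reported outcome confirmed at this risk limit
        return False, max_draws       # inconclusive: escalate toward a full hand count

    # A 55-45 reported outcome typically confirms after several hundred
    # draws; a near-tie forces far more sampling, or a full hand count.
    trail = ['W'] * 5500 + ['L'] * 4500
    print(bravo_audit(0.55, trail))

The intuition: each sampled paper ballot matching the reported winner nudges the statistic up, each mismatch pulls it down, and the audit stops only once the chance of confirming a wrong outcome has fallen below the chosen risk limit.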

As I have previously reported, many election officials have also dispensed with hand-marked paper ballots (pen and paper) in favor of new touchscreen voting machines called ballot-marking devices (BMDs). If voters miss machine errors or omissions on the paper voter records marked by these touchscreens (some, misleadingly, call them paper ballots), a risk-limiting audit can't detect that. A recent study found that voters themselves detected only 7 percent of such inaccuracies. According to Halderman, who led the study, even when a poll worker prompted voters to verify the printouts, they detected only 15 percent of such inaccuracies. The only measure that made a big difference was giving voters prefilled slates, such as completed sample ballots, to compare against the printout, at which point voters detected 73 percent of such inaccuracies. It is doubtful many voters know to ask for such a thing.

In February, the Associated Press reported that BMDs would be used by all in-person voters in four hundred counties in sixteen states. Pennsylvania, a crucial battleground state, will deploy them in two of its most populous counties, Northampton and Philadelphia, despite huge problems with them last year. In Philadelphia, per a Reuters report, poll workers and technicians reported issues with the new machines at more than 40 percent of polling locations, yet the voting machine vendor ES&S said that it was "simply inaccurate" for anyone to imply there were widespread issues. In Northampton, which has been described as potentially dispositive in Pennsylvania's presidential race, the local Republican Party chairwoman said that the results of a November 2019 election can't be trusted because of the "catastrophic failure" of the machines on that occasion. "We think voters were disenfranchised," she said.

Georgia, the only state in the nation with two Senate seats on the ballot, will deploy BMDs statewide in this election. Earlier this month, as reported by PBS News, a few counties found that the touchscreens were intermittently omitting some senatorial candidates from the review screens. The vendor claims to have fixed the problem by installing a last-minute software update on every machine in the state. Georgia's secretary of state claims that voters can have confidence because the state will conduct election audits starting in November. But per a recently adopted election rule, the state plans to audit just one race, chosen by the secretary of state, not at random. According to the Open Source Election Technology (OSET) Institute, Georgia lacks sufficient backup paper ballots in the event that these touchscreens fail.

Voter registration systems also raise transparency and security concerns. In 2019, it was reported that Russia had in 2016 breached Florida voter registration systems in Washington County, as well as at least one other county (Florida officials were obliged to sign a nondisclosure agreement as to the identity of that second county). The FBI told Florida lawmakers that it could not assess with certainty whether or not voter data had been changed.

Since the 2016 election, most states have installed devices to detect efforts at voter registration system intrusion, known as Albert sensors (after Albert Einstein), as a primary defense against hacking. As reported by Bloomberg in 2018, these sensors have a knack for detecting intrusions like those from Russian hackers and funnel suspicious information to a federal-state information-sharing center, known as the Elections Infrastructure Information Sharing and Analysis Center (an organization run by the Center for Internet Security, which Reuters describes as a nonprofit that helps governments, businesses and organizations fight computer intrusions). Per Bloomberg, Albert sensors are intended to help identify malign behavior and alert states quickly. But they can't block a suspected attack, and experts caution that they're not deployed to most of the 9,000 local jurisdictions where votes are actually cast, and that sophisticated hackers can sneak past the sensors undetected.

Similar security concerns plague electronic pollbooks, the tablet computers that poll workers use to check in voters and, more recently, also to activate the new touchscreen voting machines adopted in Georgia and elsewhere. Although all electronic election equipment is vulnerable, electronic pollbooks are particularly risky because they often rely on a Wi-Fi or Bluetooth connection. Despite these reliability and security issues, use of electronic pollbooks has risen significantly since 2016. In the 2018 midterm election, ES&S electronic pollbooks in Indiana failed due to connectivity issues in five of the seven counties that used them; one county clerk called it the worst election she'd ever experienced in her eight years on the job. In Los Angeles County in March 2020, connectivity problems with new electronic pollbooks from a company called KnowInk wreaked havoc, causing delays in voting lines of as long as five hours.

Using electronic pollbooks to activate voting machines creates additional risks. According to PBS, e-pollbooks also caused problems during Georgia's primary elections in June, including displaying the wrong races and randomly shutting down. Again, the electronic pollbooks were supplied by KnowInk, whose managing director, Scott Leiendecker, is a former Republican election official. Leiendecker's wife donated to the campaign of Georgia's Republican secretary of state before the state announced its contract with KnowInk. KnowInk's product manager once campaigned for Ed Martin, the president of the Phyllis Schlafly Eagles, which opposes the Equal Rights Amendment. KnowInk's products are now used in twenty-three states, as well as Canada.

Nor can we count on officials to tell the public if e-pollbooks or other systems are breached. In January this year, the FBI announced a change of policy whereby it will alert state election officials to local election system breaches. It has not explained why it lacked such a policy previously, nor has it committed to informing the public of breaches even after investigations have concluded. On August 4, 2020, Senator Richard Blumenthal, Democrat of Connecticut, posted on Twitter that he was "shocked and appalled" after leaving a 90-minute classified briefing on foreign malign threats to our elections. He wrote that Americans "need to see & hear these reports," which, he said, ranged from spying to sabotage, yet Congress had been sworn to secrecy, unacceptably. Later that month, Trump and his appointed intelligence leaders cancelled in-person congressional briefings about Russian interference, alleging prior improper leaks by Democrats.

Robyn Beck/AFP via Getty Images

Election workers preparing mail-in ballots for a signature verification machine, at a Los Angeles County processing center, Pomona, California, October 28, 2020

*

For his part, President Trump has deflected justified concerns about Russian hacking with unsubstantiated and fantastical claims about vote-by-mail. This includes the notion that foreign countries could counterfeit millions of mail ballots, which is not even a plausible method of fraud since election workers check mail ballots against voter registration lists. The theory was initially floated by Attorney General William Barr, who has admitted he has no actual evidence for it.

Earlier this month, Trump's partisan director of national intelligence, John Ratcliffe, who cancelled the congressional briefings on Russian interference, held a press conference emphasizing that both Russia and Iran had obtained voter registration data, and stating that Iran had faked menacing emails to voters, purporting to come from the far-right group the Proud Boys. But voter registration data is publicly available in many states, and Ratcliffe did not say whether systems had been breached to acquire it. A few days later, The New York Times reported that many intelligence officials said they remained far more concerned about Russia [rather than Iran], which has in recent days hacked into state and local computer networks in breaches that could allow Moscow broader access to American voting infrastructure. Similarly, in August, House Speaker Nancy Pelosi and Intelligence Committee chairman Adam Schiff warned that the actions of Russia, China and Iran are not the same: only one country, Russia, is actively undertaking a range of measures to undermine the presidential election and secure the outcome that the Kremlin sees as best serving its interest.

Heading into this election, election security activists are seeking to counteract this lack of transparency regarding the electronic aspects of our election system. Protect Our Votes (a group I cofounded), Democracy Counts, and Transparent Elections North Carolina are all organizing volunteers to photograph precinct totals, as shown on precinct poll tapes, after polls close, and then compare them with the official reported totals for those precincts. Although these comparisons cannot detect hacking of precinct tallies, a discrepancy between precinct totals and reported totals could indicate hacking or other problems involving the county central tabulators.
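As a rough illustration of what these volunteer comparisons involve, the sketch below (in Python) checks transcribed poll-tape totals against a county's reported totals and flags every mismatch. The CSV layout and file names are hypothetical assumptions for the example, not the format any of the groups named above actually uses.

    import csv

    def load_totals(path):
        # Read {(precinct, candidate): votes} from a CSV with columns
        # precinct, candidate, votes.
        totals = {}
        with open(path, newline="") as f:
            for row in csv.DictReader(f):
                totals[(row["precinct"], row["candidate"])] = int(row["votes"])
        return totals

    def find_discrepancies(poll_tapes, reported):
        # Yield each precinct/candidate pair whose photographed poll-tape
        # total differs from the officially reported total.
        for key in sorted(set(poll_tapes) | set(reported)):
            tape, official = poll_tapes.get(key), reported.get(key)
            if tape != official:
                yield key, tape, official

    tapes = load_totals("poll_tapes.csv")          # transcribed volunteer photos
    official = load_totals("county_reported.csv")  # official county results
    for (precinct, candidate), tape, rep in find_discrepancies(tapes, official):
        print(f"{precinct} / {candidate}: poll tape says {tape}, county reports {rep}")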

In late 2015, a poll tape analysis conducted by Bennie Smith, an election commissioner in Shelby County, Tennessee, revealed that votes had disappeared from voting machines serviced and maintained by ES&S in predominantly African-American precincts during the county's municipal election held in October that year. The Republican county election administrator, Richard Holden, who had previously been investigated by the FBI, abruptly retired after Bloomberg reported the incident. A wave of electoral victories by African-American Democratic candidates for county office soon followed. This was a striking change from Holden's six-year tenure, during which Republicans had twice swept nearly all countywide races.

Earlier this year, an election integrity group known as Audit USA led efforts to stop Shelby County's Republican-led election commission from entering into another contract with ES&S. The county's Democratic-led funding commission blocked the contract on October 12 amid concerns about the bidding process. But Smith, a Democrat, told me that Tennessee's Republican secretary of state, Tre Hargett, has since given the county election commission new ES&S scanners for use in November's election. Smith said he has not been able to ascertain whether these eleventh-hour scanners include modems, which he said would violate state law.

Elsewhere, an organization named Scrutineers does voter education work that includes various election transparency projects. One such project, called Ask the Voters, aims to conduct postelection affidavit audits of precincts or small counties with anomalous results. Another organization to which I belong, the National Voting Rights Task Force (NVRTF), will document live reported vote totals at crucial states' county websites to capture evidence of any anomalies, such as vanishing votes. NVRTF's software program will automatically capture screenshots of the reported results every fifteen minutes, along the lines of the sketch below. (The citizen task force is still seeking volunteers to assist with this aptly named Watch the Counties project.)
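A minimal sketch of that monitoring idea might look like the following. The real NVRTF tool captures screenshots; this simplified Python version, with a placeholder URL, saves a timestamped copy of a county results page every fifteen minutes so that any later disappearance of votes leaves a documented trail.

    import time
    import urllib.request
    from datetime import datetime

    RESULTS_URL = "https://example-county.gov/election-results"  # placeholder
    INTERVAL_SECONDS = 15 * 60  # fifteen minutes

    def snapshot(url):
        # Fetch the results page and save it under a timestamped file name.
        stamp = datetime.now().strftime("%Y%m%d-%H%M%S")
        with urllib.request.urlopen(url, timeout=30) as resp:
            body = resp.read()
        path = f"results-{stamp}.html"
        with open(path, "wb") as f:
            f.write(body)
        return path

    while True:
        try:
            print("saved", snapshot(RESULTS_URL))
        except OSError as err:
            print("fetch failed, will retry:", err)
        time.sleep(INTERVAL_SECONDS)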

Meanwhile, the Columbus Institute for Contemporary Journalism has sent requests to every county in seven states (Florida, Michigan, Ohio, Pennsylvania, Wisconsin, Arizona, and North Carolina) to obtain detailed information about the election equipment they are using and who specifically is programming it. The organization is preparing to file lawsuits where viable in the event of election anomalies this November.

In addition, most digital scanners in use today automatically create images of the paper ballots, which can be used by the public to compare against electronic totals. Unfortunately, many election officials erase them, but at this election, Audit USA and Citizens Oversight are leading an effort to obtain these images and compare them with official election results.

Finally, voters can report malfunctioning election equipment, voter suppression actions, and other problems to See Say 2020, which will vet these reports and post details of incidents on an interactive map, both to inform the public and to serve as potential evidence if official results need contesting.

In an ideal world, such independent monitoring efforts and citizen initiatives would not be necessary. Americans could go to the polls, vote, and be sure their ballots would be counted in a free and fair election. But that is not the reality. Instead, we face an unprecedented combination of election interference from hostile foreign powers and a president intent on keeping the public confused and uninformed about threats to our election infrastructure. As another US president liked to say: trust, but verify.

Go here to read the rest:
How Safe Is the US Election from Hacking? - The New York Review of Books

NowThis partners with Calm to offer a soothing Election Day live stream – TechCrunch

Mobile news organization NowThis is announcing a bit of counter-programming for next week's presidential election: it's partnering with meditation app Calm to create a live stream for anyone who needs relief from the stress of Election Day news.

Not that NowThis is exactly avoiding that news, but this live stream (combining breathing exercises and other meditative activities with peaceful nature footage from all 50 states) should offer a brief respite from obsessively checking results.

It will go live at 5:30 p.m. Eastern on Election Day on NowThis' Facebook and YouTube pages, and it will run through the following day.

"As the leading news brand for young people, NowThis often covers important news of the day with hope and optimism," said NowThis Chief Content Officer Tina Exarhos in a statement. "With voters across the country experiencing a uniquely stressful election, NowThis is excited to partner with Calm to add counter-programming to our coverage, providing a respite for audiences as Election Day comes to a close."

NowThis is part of Group Nine Media, which also includes Thrillist, The Dodo and Seeker.

Calm, meanwhile, is reportedly looking to raise more funding to take advantage of growing interest in meditation and mindfulness apps. Plus, it's already been moving beyond apps with a celebrity-filled show for HBO Max.

Visit link:
NowThis partners with Calm to offer a soothing Election Day live stream - TechCrunch

Farnell inks deal with Industrial Shields to distribute o… – evertiq.com

Image: andreypopov / dreamstime.com
Business | October 30, 2020

The benefits of utilising open source hardware have been argued for quite some time, but one of the main ones is the opportunity to bring industrial products to market at a lower cost and in less time than designing with proprietary processor boards.

The new partnership strengthens Farnell's industrial automation and control portfolio by providing more options to customers developing monitoring, control or automation solutions based on single board computers (SBCs). The addition of this new range of PLCs, Panel PCs and the Open Mote B ultra-low-power communications board for the Internet of Things (IoT) will further boost Farnell's catalog.

Automation continues to grow in all domestic and industrial sectors, including home and factory automation, large buildings and smart cities. The use of open source hardware removes the lock-in associated with proprietary PLCs, giving customers much more control and ownership of the design. Many designers are already familiar with the programming environments of Raspberry Pi and Arduino, which offer easy access to powerful tools and information about the design and operation of the products. Developers of industrial automation solutions also have the option to integrate a greater range of electronic components into their designs, reducing the cost of end products.

Farnell now stocks Industrial Shields' full product range, including automation devices such as the Touchberry Pi 10.1 Panel PC, based on the Raspberry Pi 4B; the ARDBOX PLC product family, based on the Arduino Leonardo board; and the Ethernet-enabled M-DUINO family of PLCs.

"This new global franchise with Industrial Shields increases our range of PLCs and Panel PCs, giving our customers greater choice of low cost and flexible devices to support their automation needs. The innovative use of open source hardware will create designs that speed time to market, reduce costs, and put the customer in control. This is a key addition to our automation range, bringing the benefits of industrial automation closer to our customers than ever before," says Lee Turner, Global Head of Semiconductors and SBC at Farnell, in a press release.

"The collaboration with Farnell gives us the opportunity to continue growing. Thanks to this agreement, we are consolidating our position as the leading manufacturer of open source-based systems for industrial automation, and strengthening our company's international presence. In addition, the fact that a prestigious group such as Farnell is committed to Industrial Shields products gives our customers the peace of mind of using safe, quality systems," adds Albert Prieto, CEO of Industrial Shields.
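To illustrate the familiarity argument above: a Raspberry Pi-based controller can be driven with the same tools hobbyists already know. The sketch below, in Python with the widely used gpiozero library, energises a digital output whenever a digital input closes. The pin numbers are placeholders chosen for the example, and this is not Industrial Shields' own API; the vendor documents the actual mapping of its industrial I/O to Raspberry Pi pins.

    from gpiozero import Button, LED  # generic digital input / output helpers
    from signal import pause

    sensor = Button(17)  # placeholder BCM pin for a digital input
    relay = LED(27)      # placeholder BCM pin for a digital output

    sensor.when_pressed = relay.on    # input closes -> energise output
    sensor.when_released = relay.off  # input opens  -> de-energise output

    pause()  # keep the script alive, reacting to input edges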

Read more:
Farnell inks deal with Industrial Shields to distribute o... - evertiq.com