Baidu offers quantum computing from the cloud – VentureBeat

Following its developer conference last week, Baidu today detailed Quantum Leaf, a new cloud quantum computing platform designed for programming, simulating, and executing quantum workloads. It's aimed at providing a programming environment for quantum-infrastructure-as-a-service setups, Baidu says, and it complements the Paddle Quantum development toolkit the company released earlier this year.

Experts believe that quantum computing, which at a high level entails the use of quantum-mechanical phenomena like superposition and entanglement to perform computation, could one day accelerate AI workloads. Moreover, AI continues to play a role in cutting-edge quantum computing research.

Baidu says a key component of Quantum Leaf is QCompute, a Python-based open source development kit with a hybrid programming language and a high-performance simulator. Users can leverage prebuilt objects and modules in the quantum programming environment, passing parameters to build and execute quantum circuits on the simulator or cloud simulators and hardware. Essentially, QCompute provides services for creating and analyzing circuits and calling the backend.
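
The circuit-building workflow described above can be illustrated with a minimal statevector simulation. Note this is not Baidu's actual QCompute API, just a plain-Python sketch of the math such a simulator performs when executing a simple two-qubit entangling circuit:

```python
import math

# Plain-Python sketch of a two-qubit statevector simulation, the kind of
# computation a quantum SDK's local simulator performs. This is NOT the
# QCompute API; it is an illustration of the underlying math only.

# Amplitudes over the basis states |00>, |01>, |10>, |11>
state = [1.0, 0.0, 0.0, 0.0]  # start in |00>

def apply_hadamard_q0(s):
    """Hadamard on qubit 0 (the most significant bit of the basis index)."""
    h = 1 / math.sqrt(2)
    return [h * (s[0] + s[2]),
            h * (s[1] + s[3]),
            h * (s[0] - s[2]),
            h * (s[1] - s[3])]

def apply_cnot(s):
    """CNOT with qubit 0 as control: swaps the |10> and |11> amplitudes."""
    return [s[0], s[1], s[3], s[2]]

state = apply_cnot(apply_hadamard_q0(state))
probs = [abs(a) ** 2 for a in state]
print(probs)  # probabilities ~[0.5, 0.0, 0.0, 0.5] -- an entangled Bell state
```

Real SDKs wrap this kind of math in prebuilt circuit objects and let the same program target either a local simulator or cloud hardware, which is the role QCompute plays in Quantum Leaf.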

Quantum Leaf dovetails with Quanlse, which Baidu also detailed today. The company describes Quanlse as a cloud-based quantum pulse computing service that bridges the gap between software and hardware by providing a service to design and implement pulse sequences as part of quantum tasks. (Pulse sequences are a means of reducing quantum error, which results from decoherence and other quantum noise.) Quanlse works with both superconducting circuits and nuclear magnetic resonance platforms and will extend to new form factors in the future, Baidu says.

The unveiling of Quantum Leaf and Quanlse follows the release of Amazon Braket and Google's TensorFlow Quantum, a machine learning framework that can construct quantum data sets, prototype hybrid quantum and classical machine learning models, support quantum circuit simulators, and train discriminative and generative quantum models. Facebook's PyTorch relies on Xanadu's PennyLane, a third-party, multi-contributor library for quantum machine learning, automatic differentiation, and optimization of hybrid quantum-classical computations. And Microsoft offers several kits and libraries for quantum machine learning applications.


Security researchers resolve crypto flaws in JHipster apps – The Daily Swig

John Leyden, 23 September 2020 at 11:27 UTC. Updated: 24 September 2020 at 13:10 UTC

Nearly 4,000 pull requests were issued to fix dependent projects

UPDATED Security researchers have run a successful exercise to refactor apps that inherited a cryptographic flaw from a vulnerable code generator, JHipster.

Both JHipster and JHipster Kotlin were updated in late June to break their reliance on a weak pseudo-random number generator (PRNG).

The vulnerability meant that an attacker who had obtained a password reset token from a JHipster- or JHipster Kotlin-generated service would be able to correctly predict future password reset tokens.

This made it possible for an unauthorized third party to request an administrator's password reset token in order to take over a privileged account.

Web applications and microservices built using a vulnerable version of either JHipster or JHipster Kotlin were not themselves fixed even after the code-generating utilities were updated to fixed versions - JHipster 6.3.0 and JHipster Kotlin 1.2.0, respectively.

Software engineer Jonathan Leitschuh estimated in early July that there were as many as 14,600 instances of vulnerable applications generated using vulnerable builds of JHipster on GitHub.

BACKGROUND App generator tool JHipster Kotlin fixes fundamental cryptographic bug

Over the course of 16 hours, 3,880 pull requests were issued to fix instances of CVE-2019-16303, the PRNG vulnerability in the JHipster code generator.

The same underlying vulnerability also affected apps made using JHipster Kotlin.

The root cause of the problem in the case of both JHipster and JHipster Kotlin was reliance on Apache Commons Lang 3 RandomStringUtils to handle PRNGs.
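
This class of bug is easy to demonstrate. The sketch below uses Python rather than JHipster's actual Java code, but the pattern is the same: tokens drawn from a deterministic PRNG are reproducible, while tokens drawn from a cryptographically secure generator (the role SecureRandom plays in the JHipster fix) are not.

```python
import random
import secrets
import string

ALPHABET = string.ascii_letters + string.digits

# INSECURE: a deterministic PRNG (Python's Mersenne Twister here, analogous
# to java.util.Random behind RandomStringUtils). An attacker who recovers
# the generator's internal state can predict every future token.
def insecure_token(rng: random.Random, length: int = 20) -> str:
    return "".join(rng.choice(ALPHABET) for _ in range(length))

# Two generators in the same state emit identical "random" tokens -- the
# property that made the reset tokens predictable.
assert insecure_token(random.Random(42)) == insecure_token(random.Random(42))

# SECURE: draw from the operating system's CSPRNG instead.
def secure_token(length: int = 20) -> str:
    return "".join(secrets.choice(ALPHABET) for _ in range(length))

print(secure_token())  # 20 unpredictable characters
```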

The JHipster app patching exercise, supported by GitHub Security Lab, relied on a code refactoring tool developed by Jon Schneider of source code transformation startup Moderne.

Leitschuh told The Daily Swig: "We plan to do this sort of thing again in the future with other vulnerabilities, but hopefully ones that are more complex and less cookie cutter."

JHipster is an open source package that's used to generate web applications and microservices. JHipster Kotlin performs the same functions to generate apps that are compatible with Kotlin, a modern cross-platform programming language.

This story has been updated and revised to reflect that the refactoring exercise focused on JHipster-generated apps and not JHipster, as first and inaccurately reported.

RECOMMENDED Critical XSS vulnerability in Instagram's Spark AR nets 14-year-old researcher $25,000


Communalism: The other virus in India | Opinion – Hindustan Times

Covid-19 is not the only virus stalking the nation. Hate is in the air, with social media acting as a super-spreader. Last week, on the death of prominent social activist Swami Agnivesh, a former Indian Police Service (IPS) officer, N Nageswara Rao, tweeted: "Good riddance... You were an anti-Hindu donning saffron clothes... my grievance against Yamraj (god of death) is why did he wait so long?" After protests from several Twitter users, the social media site pulled down the offensive tweet.

Rao is no ordinary police officer. In 2018, he was appointed acting director of the Central Bureau of Investigation (CBI) and also director-general, Fire Services and Home Guards, before retiring in July. That an officer holding such high posts should make such a spiteful remark is perhaps a sign of the times; those who swear allegiance to the Constitution are now wearing the ideology of hate on their khaki uniform, to the point of wishing death on someone. Worse, Rao has defended his hate speech.

Recall a similar tweet when journalist-activist Gauri Lankesh was shot dead in 2017. Then, a Surat-based businessman, Nikhil Dadhich, tweeted: "A bitch died a dog's death and now all the puppies are wailing in the same tune!" It was a disgusting remark that seemed to celebrate the assassination. It may even have passed unnoticed, but for an inconvenient truth: Dadhich was followed by Prime Minister Narendra Modi on the micro-blogging site. Once again, the individual was unapologetic.

Rao and Dadhich are not alone: There are thousands of anonymous Twitter handles, Facebook posts and WhatsApp groups that are designed to spread animosity between individuals and communities. Under the guise of being open-source platforms, the social media universe has created its own code of conduct where the lines between free speech and hate speech are often blurred.

These are, as an outstanding recent Netflix documentary, The Social Dilemma, puts it, the digital Frankensteins of our times, amoral beasts running amok in a social media jungle where the rules are being subverted to promote hatred and division.

This big tech-driven social media hate-machine is transitioning seamlessly into the news environment. Thus, to blame social media alone for stoking disharmony would be to run away from the nature of the virus. Hate is an infection that is contagious when it is normalised, as has happened in recent years. The anti-minority dog-whistles, for example, are now so frequently espoused that their expression is almost seen as routine. The "Indian Muslim as anti-national" narrative has been deliberately and repeatedly pushed by a section of the power elite so as to acquire a potency of its own. When a rabble-rousing Union minister screams in an election meeting, "Desh ke gaddaron ko" (the traitors to the nation), and the crowd responds with "Goli maaron saalon ko" (shoot the scoundrels), there is little attempt made to rein in the minister. Or indeed when anti-Citizenship (Amendment) Act (CAA) protesters are "identified by their clothes" or illegal immigrants are referred to as "termites", there is a brazen attempt to stoke religious prejudice. It is almost as if a hyper-polarised environment is a spur for incendiary communal rhetoric.

Just how far this normalisation of a narrative of hate and bigotry has travelled is best exemplified by the recent Sudarshan TV case, involving a series of programmes done by the channel to purportedly investigate a Muslim conspiracy to take over the civil services. A slogan, "UPSC jihad", was put out as a promotional video. Rather than acting ab initio against a programme that was prima facie intended to vilify the Muslim community, the information and broadcasting (I&B) ministry allowed the telecast, saying it did not wish to pre-censor the programme. This despite the fact that the I&B programming code allows the ministry to prohibit a programme if it is likely to promote hatred or ill-will between communities. It required the Delhi High Court and then the Supreme Court to step in and stop the further broadcast of the programme before the ministry finally issued the channel a notice. Maybe the ministry views Sudarshan TV with a more benevolent gaze, since the channel is perceived to be in sync with the ruling party's ideology.

But while Sudarshan TV may espouse an unapologetic militant Hindutva worldview, what of those mainstream channels which quietly push a daily drip of communal poison and fake news with the sole objective of demonising a community? Take for example the lynching of two sadhus at Palghar in Maharashtra a few months ago. Some channels projected the killings as a Hindu-Muslim conflict while lining up extremists from both communities in a slugfest that passes as prime time debate. Now, when it turns out that the claims of a communal angle are false and all those arrested are local tribals who mistook the sadhus for kidnappers based on WhatsApp rumours, will any news channel publish an apology for having misled viewers to garner television rating points? Those news traffickers who seek to profit from hate must be acted against swiftly. Only then can we find a vaccine to the virus that threatens to divide us.

Post-script: Since we started with a story of a police officer, let me end with a police officer too. For over a year now, I have been receiving WhatsApp messages from a senior IPS officer echoing the strident Islamophobia which is so prevalent today. The officer was once in charge of a city with a large Muslim population. Is it any surprise then that law-enforcers are often caught on the wrong side of the law when there is a communal riot?

Rajdeep Sardesai is a senior journalist and author

The views expressed are personal


TIBCO Aims to Drive Faster Adoption of Real-Time Analytics – RTInsights

The new offerings aim to help businesses modernize data management, making it available to applications in real time, and manage data as a true business asset.

TIBCO Software today launched an initiative to accelerate the adoption of real-time analytics applications at a time when many organizations are accelerating digital business transformation initiatives to mitigate the impact of the economic downturn brought on by the COVID-19 pandemic.

Announced at an online TIBCO NOW 2020 conference, a TIBCO Cloud Data Streams offering combined with the latest edition of the TIBCO Spotfire analytics application is at the core of a TIBCO Hyperconverged Analytics platform that makes it possible to collect and analyze data in real-time.

See also: TIBCO Software Shares COVID-19 Analytics

At the same time, an existing TIBCO Responsive Application Mesh blueprint is being extended to make it simpler to achieve that goal, says TIBCO CTO Nelson Petracek.

At the core of that framework are a bevy of updates to existing TIBCO offerings, including a Big Basin update to TIBCO Cloud Integration that adds support for robotic process automation (RPA) capabilities alongside a revamped user interface.

TIBCO has also added TIBCO Cloud Mesh, which makes it simpler for IT teams to create and discover, for example, application programming interfaces (APIs) and integrations in TIBCO Cloud. The company has also updated TIBCO BusinessEvents to provide more contextual processing of events in real time via integrations with the open source Apache Kafka, Apache Cassandra, and Apache Ignite frameworks and databases.

Finally, TIBCO revealed that TIBCO Business Process Management (BPM) Enterprise can now be deployed using containers, and launched TIBCO Any Data Hub, a data management blueprint based on TIBCO Data Virtualization software that can be connected to more than 300 data sources.

"Data virtualization has emerged as a critical lynchpin for accelerating digital business transformation," says Petracek. In an ideal world, organizations would centralize all their data within a data lake. However, building data lakes takes time that many organizations don't have as they race to reengineer business processes, notes Petracek. Data virtualization tools enable applications to access data without having to move it, which Petracek says lets IT teams more quickly deploy applications capable of accessing data anywhere it happens to reside.

"Data virtualization is at the core of those initiatives," says Petracek.
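
Data virtualization is simpler to see in code than in the abstract. The toy sketch below (hypothetical names, not TIBCO's actual API) shows the core idea: a virtual view joins data from separate systems at query time, so nothing has to be copied into a central lake first.

```python
# Toy illustration of data virtualization (illustrative names only, not
# TIBCO's API): two systems of record stay where they are, and a "virtual
# view" joins them on demand instead of copying data into a lake.

crm = {"cust-1": {"name": "Acme"}, "cust-2": {"name": "Globex"}}
billing = {"cust-1": {"balance": 120.0}, "cust-2": {"balance": 0.0}}

def virtual_customer_view(customer_id: str) -> dict:
    """Resolve a query by joining the underlying sources at request time."""
    return {"id": customer_id, **crm[customer_id], **billing[customer_id]}

print(virtual_customer_view("cust-1"))
# {'id': 'cust-1', 'name': 'Acme', 'balance': 120.0}
```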

As they build and deploy these applications, organizations are also trying to move beyond batch-oriented processing to provide more responsive application experiences, notes Petracek. While there is currently a lot of focus on application development, Petracek says it's also becoming apparent that the way data is managed needs to be modernized as well, to make data available to applications in real time.

Ultimately, organizations are finally moving toward managing their data as a true business asset, notes Petracek. The issue is determining what data has the most business value, which in turn drives the digital business processes around which the organization operates, says Petracek.

It may be a while before organizations modernize data management across the entire enterprise. Data virtualization tools, however, clearly have a role to play in jumpstarting that process. The challenge now is figuring out first what data is required to drive a process, and then rationalizing all the conflicting data that today resides in far too many application silos.


No pixel left behind: The new era of high-fidelity graphics and visualization has begun – VentureBeat

Presented by Intel

Everybody loves rich images. Whether it's seeing the fine lines on Thanos' villainous face, every strand of hair in The Secret Life of Pets 2, lifelike shadows in World of Tanks, COVID-19 molecules in interactive 3D, or the shiny curves of a new Bentley, demand for vivid, photorealistic graphics and visualizations continues to boom.

"We're visual beings," says Jim Jeffers, senior director of Advanced Rendering and Visualization at Intel. "Higher image fidelity almost always drives stronger emotions in viewers, and provides improved context and learning for scientists. Better graphics means better movies, better AR/VR, better science, better design, and better games. Fine-grained detail gets you to that 'Wow!'"

Appetite for high quality and high performance across all visual experiences and industries has sparked major advances and new thinking about how computer-generated graphics can quickly and efficiently be made even more realistic.

In this interview summary, Jeffers, co-inventor earlier in his career of the NFL's virtual first-down line, discusses the road ahead for a new era of hi-res visualization. His key insights include: a broadening focus beyond individual processors to open XPU platforms, the central role of software, the proliferation of state-of-the-art ray tracing and rendering, and the myth of one size fits all. ("Just because GPU has a 'G' in front of it," he says, "doesn't mean it's good for all graphics functions. Even with ray tracing acceleration, a GPU is not always the right answer for every visual workflow.")

Above: Intel's Jim Jeffers

Take a look at some of today's big graphics trends and impacts: Higher fidelity means more objects to render and greater complexity. Huge datasets and an explosion of data require more memory and efficiency. The data explosion is outpacing what today's card memory can address, leading to demand for more efficient system-wide memory utilization. AI integration is producing faster results, and there's greater collaboration, from edge to cloud.

There's another new factor: interactivity. In the past, data visualization was predominantly used to create static plots and graphs, or an offline rendered image or video. This remains valuable today, but for simulations of real-world physics and digital entertainment, scientists and filmmakers want to interact with the data. They want to drill down to see the detail, turn the visualization around, and get a 360-degree view for better understanding. All that means more real-time operations, which in turn requires more compute power.

Above: A high-speed, interactive visualization of stellar radiation. Image credit: Intel and Argonne National Labs; simulation provided by University of California, Santa Barbara.

For example, UC Santa Barbara and Argonne National Labs needed to study the temperature and magnetic fluctuations over time of simulated star flares to better understand how stars behave. To visualize that dataset with 3,000 time-steps (frames), each about 1 GB in size, you need about 3 TB of memory. Considering a current high-end GPU with 24 GB of memory, it would take 125 GPUs packed into 10 to 15 server platforms to match just one dual-socket Intel Xeon processor platform with Intel Optane DC memory that can load and visualize the data. Further, that doesn't even factor in the performance limitations of transferring 3D data over the PCIe bus, or the 200-300 watts of power needed per card in the server platforms the GPUs are installed in.
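
The GPU count in that comparison follows directly from the quoted figures: a roughly 3 TB dataset against 24 GB of memory per GPU card.

```python
import math

# Back-of-the-envelope check of the memory math above: how many 24 GB GPUs
# does it take to hold a ~3 TB visualization dataset? (Decimal units are
# used for simplicity.)
total_gb = 3 * 1000        # ~3 TB dataset
gpu_memory_gb = 24         # current high-end GPU card

gpus_needed = math.ceil(total_gb / gpu_memory_gb)
print(gpus_needed)  # 125
```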

Pretty clearly, a next-gen approach is crucial for producing these rich, high-fidelity, high-performing visualizations and simulations even faster and more simply. New principles are driving state-of-the-art graphics today and will continue to do so.

No transistor left behind. High-fidelity graphics require real-world lighting plus more objects, at higher resolution, to drive compelling photorealism. A virtual room created with one table, a glass, a grey floor with no texture, and ambient lighting isn't particularly interesting. Each object and light source you add, down to the dust floating in the air and reflecting light, sets the scene for real-life experiences. This level of complexity involves moving, storing, and processing massive amounts of data, often simultaneously. Making this happen requires serious advancements across the computing spectrum: architecture, memory, interconnect, and software, from edge to cloud. So the first huge shift is to leverage the whole platform, as opposed to a single processing unit. The platform includes all CPUs and GPUs, and potentially other elements such as Intel Optane persistent memory, perhaps FPGAs, as well as software.

A platform can be optimized towards a specialized solution such as product design or the creative arts, but it still uses one core software stack. Intel is actively moving in this direction. Over time, a platform approach allows us to continually deliver an evolutionary path to an XPU era, exascale computing, and open development environments. (More on that in a bit.)

No developer left behind. Handling all this capability and data pouring into the platform is complicated. How does a developer approach that? You have a GPU over here, two CPUs over there, and various specialized accelerators. There might be two individual CPUs at a data center platform, each with 48 cores, and with each core being its own CPU. How do you program that without blowing your mind? Or spending ten years?

What's needed is a simplified, unified programming model that lets a developer take advantage of all the available hardware capabilities without rewriting code for every processor or platform. Modern, specialized workloads require a variety of architectures, as no single platform can optimally run every workload. We need a mix of scalar, vector, matrix, and spatial architectures (CPU, GPU, AI, and FPGA programmability), along with a programming model that delivers performance and productivity across all of them.

That's what the oneAPI industry initiative and the Intel oneAPI product are about: designing efficient, performant heterogeneous programming, where a single code base can be used across multiple architectures. The oneAPI initiative will accelerate innovation with the promise of portable code, provide easier lifts when migrating to new, innovative generations of supported hardware, and help remove barriers such as single-vendor lock-in.

No pixel left behind. The other key piece of the platform is open source rendering tools and libraries designed to integrate capabilities and accelerate all this power. High-performance, memory-efficient, state-of-the-art tools such as Intel's oneAPI Rendering Toolkit open the door to creating film-fidelity visuals not just across film/VFX and animation but also HPC scientific visualization, CAD, content creation, gaming, AR, and VR - essentially anywhere that better images, aligned with how our visual system processes them, are important.

Ray tracing is especially important in this new picture. If you compare the animated visual effects from a movie ten years ago with a movie today, the difference is amazing. A big reason for this is improved ray tracing. That's the technique that generates an image by tracing the path of light and then simulating the effects of its encounters with virtual objects to create better pixels. Ray tracing produces more detail, complexity, and visual realism than typical rasterized scanline rendering.
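
The "tracing the path of light" step comes down to geometry: for each ray, the renderer solves for intersections with scene objects. A minimal, self-contained sketch of a single ray-sphere test, the kind of primitive operation a ray tracer performs billions of times, looks like this:

```python
import math

def dot(a, b):
    """Dot product of two (x, y, z) tuples."""
    return a[0] * b[0] + a[1] * b[1] + a[2] * b[2]

def ray_hits_sphere(origin, direction, center, radius):
    """Distance along the ray to the nearest hit, or None on a miss.

    Solves |origin + t*direction - center|^2 = radius^2, a quadratic in t.
    """
    oc = tuple(o - c for o, c in zip(origin, center))
    a = dot(direction, direction)
    b = 2.0 * dot(oc, direction)
    c = dot(oc, oc) - radius * radius
    disc = b * b - 4 * a * c
    if disc < 0:
        return None                      # the ray misses the sphere entirely
    t = (-b - math.sqrt(disc)) / (2 * a)
    return t if t > 0 else None          # ignore hits behind the ray origin

# A ray looking down -z hits a unit sphere centered 5 units away at t = 4:
print(ray_hits_sphere((0, 0, 0), (0, 0, -1), (0, 0, -5), 1.0))  # 4.0
# The same ray misses a sphere offset to the side:
print(ray_hits_sphere((0, 0, 0), (0, 0, -1), (3, 0, -5), 1.0))  # None
```

A full renderer repeats this test per pixel, bounces secondary rays for shadows and reflections, and shades from the accumulated light paths, which is where the computational cost discussed here comes from.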

Compute platforms and tools have been continually evolving to handle larger data sets with more objects and complexity. So, it has become possible to deliver powerful render capabilities that can accelerate all types of workloads: interactive CPU rendering, global illumination with physically based shading and lighting, selective image denoising, and combined volume and geometry rendering. Intels goal is to enable these capabilities to run at all platform scales on laptops, workstations, across the enterprise, HPC, and cloud.

Above: New ray tracing technology provides powerful capabilities far beyond today's GPUs. Expanding model complexity beyond basic triangles to other shapes (above) accelerates rendering and increases accuracy while eliminating pesky inaccurate artifacts. Image credit: Intel

One of the most important new advances is in primitives, or graphics building-block shapes. Most products today, especially GPU-based products, are highly attuned to triangles only. They're the equivalent of an atom. So if you look at a globe in 3D, they're showing you a mesh of triangles. Moving beyond triangles to other shapes lets individual objects such as discs, spheres, and 3D forms like a globe or hair require a smaller memory footprint and typically much less processing time than, say, a million triangles. Reducing the number of objects and the required processing can help you turn a film around faster, to, say, 12 months instead of 18, achieve higher accuracy and better visual results, and be photorealistic with fewer visible artifacts. These existing ray tracing features, plus new ones, will take advantage of Intel's upcoming XPU platforms with Xe discrete GPUs.
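
The memory argument for analytic primitives is easy to quantify. With illustrative assumptions (4-byte floats, no index or normal data), compare an analytic sphere to a million-triangle mesh of the same globe:

```python
# Illustrative footprint comparison: an analytic sphere (center + radius)
# versus a 1M-triangle mesh storing three 3D vertices per triangle.
# The exact figures are assumptions for illustration only.
FLOAT_BYTES = 4

sphere_bytes = 4 * FLOAT_BYTES                  # (cx, cy, cz, r) -> 16 bytes
triangles = 1_000_000
mesh_bytes = triangles * 3 * 3 * FLOAT_BYTES    # 36,000,000 bytes

print(sphere_bytes)                  # 16
print(mesh_bytes // (1024 * 1024))   # 34 (MB, before indices and normals)
```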

A lot of this is already taking place. Take the example from University of California, Santa Barbara, and Argonne National Labs mentioned before. They're using a ray tracing method called volumetric path tracing to visualize magnetism and other radiation phenomena of stars. Using open-source software and several connected servers with large random-access plus persistent memory, researchers can load and interact with (zoom, pan, tilt) 3+ TB of time-series data. That would not have been feasible with a GPU-focused approach.

Film and animation studios have been on the leading edge of this new technology. Tangent Studios, working together with Baozou studios as creators of Next Gen for Netflix, delivered motion blur and key rendering features in Blender with Intel Embree. They're now doing renders five to six times faster than before, with higher quality. Laika, a stop-motion animation studio, worked with Intel to create an AI prototype that cut the time needed to do image cleanup, a painstaking job, by 50%.

Above: Bentley's interactive online configurator brings buyers ultra-high-res images of 10 billion orderable combinations of autos

In product design and customer experience, Bentley Motors Limited is using these pioneering open-source rendering techniques. They're generating, on the fly, 3D images of its luxury cars for a custom car configurator. Bentley and Intel demonstrated a prototype virtual showroom where buyers will interactively configure paint colors, wheels, interiors, and much more. The prototype included 11 Bentley models rendered accurately, with 10 billion possible configuration combinations, which used 120 GB of memory per node. The whole platform, a ten-server environment, ran at 10-20 fps, with hyper-real visuals and interactive, AI-based denoising via Intel Open Image Denoise. More on graphics acceleration at Bentley here.

These new approaches come as we're on the doorstep of the exascale computational era: a quintillion floating-point operations in one second. Creating high-performance systems that deliver those quintillion flops in a consumable way is a huge challenge. But the potential benefits could also be huge.

Think about a render farm, effectively a supercomputing data center, likely with thousands of servers, that handles the computing needed to produce animated movies and visual effects. Today, one of these servers works on a single frame for eight, 16, or even 24 hours. It's typical for a 90-minute animated movie to have 130,000 frames. At an average of 12-24 hours of computation per frame, you're looking at between 1.5 and 3 million compute-hours. Not minutes, hours. That's 171 to 342 compute-years! Applying the exascale capabilities now being developed at Intel to rendering, with large-memory systems, distributed capability, smart software, and cloud services, could reduce that time dramatically.
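
Those compute-year figures follow directly from the numbers quoted above:

```python
# Checking the render-farm arithmetic: ~1.5M to ~3M compute-hours for a
# 130,000-frame film, expressed as compute-years.
frames = 130_000
low_hours = frames * 12      # 1,560,000 (the article rounds to ~1.5M)
high_hours = frames * 24     # 3,120,000 (rounded to ~3M)
hours_per_year = 24 * 365    # 8,760

print(1_500_000 // hours_per_year)  # 171
print(3_000_000 // hours_per_year)  # 342
```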

Above: Exascale computing could bring characters to life faster. Image credit: The Secret Life of Pets 2, Illumination Entertainment

Longer term, pouring exascale capability into a gaming platform or even onto a desktop could revolutionize how content gets made. A filmmaker might be able to interactively view and manipulate a scene at 80% or 90% of a movie's final quality, for example. That would reduce the turnaround time, known as iterations, to get to the final shot. Consumers might have their own vision and, using laptops with such technology, could become creators themselves. Real-time interactivity will further blur the line between movies and games in exciting ways that we can only speculate about today, but will ultimately make both mediums more compelling.

NASA Ames researchers have done simulations and visualization with the Intel oneAPI Rendering Toolkit libraries, including wind tunnel-like effects on flying vehicles, landing gear, space parachutes, and more. When the visualization team showed their collaborating scientist an initial, basic rasterized visualization without ray tracing effects to check the accuracy of the data, the scientist said, "Yes, you are on the right track." A week later, the team showed an Intel OSPRay ray-traced version, and the scientist said: "That's great! Next time skip that other image, and just show me this more accurate one."

Innovative new platforms with combinations of processing units, interconnect, memory, and software are unleashing the new era of high-fidelity graphics. The picture is literally getting better and brighter and more detailed every day.

Learn more:

High Fidelity Rendering Unleashed (video)


Sponsored articles are content produced by a company that is either paying for the post or has a business relationship with VentureBeat, and they're always clearly marked. Content produced by our editorial team is never influenced by advertisers or sponsors in any way. For more information, contact sales@venturebeat.com.


Investing In Kava – Everything You Need to Know – Securities.io

As Bitcoin approaches its 10-year anniversary, the world's first and most successful cryptocurrency is still a mystery to many people in the market. Even as Bitcoin has made its way into the vocabulary of the masses, the average person still wonders: how does Bitcoin work, and what makes this computer money so valuable?

The crypto market continues to expand to new heights. Every week, new blockchains, tokens, coins, and exchanges enter the market. Each of these products provides users with a valuable service. However, all of these technologies owe a nod of appreciation to the world's original cryptocurrency: Bitcoin.

Per Satoshi Nakamoto, Bitcoin's anonymous creator, Bitcoin is a "Peer-to-Peer Electronic Cash System". Let's examine this statement in depth to really grasp exactly what Nakamoto states here. Firstly, he states Bitcoin is a peer-to-peer network.

Bitcoin Whitepaper

Peer-to-peer transactions are direct transactions. A great example of this style of transaction is when you hand cash to someone. When you hand your neighbor $5 cash, that is a direct transaction. There was no intermediary involved. There was no account validation, or central bank approving your transaction. You acted freely.

Now look at the same transaction, but this time you pay with your debit card or a payment app. While it may appear as if the funds instantly transfer from your account to theirs, this is hardly the case. Your payment begins a long, arduous journey that can take days.

First, your payment order checks with both banks to make sure that the accounts are valid and that there are funds in your account to send. Then your payment action is sent to a major payment processing firm. In most instances, this is Visa or MasterCard.

Next, your funds bounce around 30+ intermediaries before reaching their destination around three days later. That's why refunds of debit or credit transactions take days to show up in your account.

All of these steps add more time to your transaction. Additionally, each intermediary and verification process tacks on a fee for its services. On top of all of these concerns, your transaction still must go through the regulatory channels. If, for some reason, there is a discrepancy between your government and the government of the person you want to pay, you will find it impossible to send these funds.

Bitcoin Trading Analysis via CoinMarketCap

The reason behind all of these intermediaries is simple: the current financial system is centralized. In a centralized system, there is one central organization, such as a bank or government, that holds all the power. They hold your funds, they approve your transactions, and they decide when to issue more currency. You're just along for the ride.

In a decentralized network, you remain in control of your assets until the exact moment they arrive at their destination. When you send Bitcoin from your wallet to another person's wallet, there are no intermediaries between your payment and its destination.

As such, there is no third party to approve or deny your transactions. The entire process occurs in a peer-to-peer fashion. It's the same as handing someone digital cash. In short, you regain control over your finances using a decentralized system.

At first, the concept of decentralization can seem a bit awkward to grasp. However, a quick glance at the market reveals other decentralized systems hard at work. A perfect example of a decentralized system you are more than likely familiar with is torrent file-sharing services.


When you go to a torrent website, you probably ask yourself how these platforms remain open even though they offer content they don't have licenses to distribute. The answer is simple: they utilize decentralization to prevent censorship. Here's how decentralization is used in this scenario to bring you all your favorite early releases and new music for free.

Websites like BitTorrent don't actually provide you with any content. In reality, they just provide a location for people to meet up and exchange data freely, whatever that data may be. Granted, in most cases it's music or movies, but it could be anything from political messages to actual value, such as cryptocurrencies.

Because these websites only provide a location for people to meet and exchange data, they are much more difficult to shut down than a centralized website that offers downloads directly. In essence, these websites themselves have done nothing wrong.

The same concepts can be put to use in the financial sector. Through the integration of decentralization, it becomes impossible to censor, edit, or block payments on the blockchain. In this way, Bitcoin represents an ideological shift toward greater financial freedom and a decoupling of currency from government.

To understand Bitcoin, you first need to look at some of the core technologies that make this marvelous coin function. As you now know, decentralized networks are censorship-resistant. There are also many types of decentralized networks. Bitcoin relies on a blockchain network to provide you with these freedoms.

A blockchain is a decentralized network that uses blocks of transactions to create a complete chain of events going back to the network's launch. In Bitcoin's blockchain network, there are thousands of transaction validators known as nodes, many of which are miners. Importantly, every node validates every transaction on the blockchain, but not every node receives a reward.

These miners compete against each other to solve a computationally intensive hashing puzzle. The first miner to find a valid solution gets to add the next block of transactions to the blockchain and receives a reward for the effort. Today, the reward is set at 6.25 BTC.


The puzzle is built around the SHA-256 hash function, which cannot be run in reverse, so your computer has no option but to make enormous numbers of educated guesses rather than calculate an answer directly. This guesswork is what drives up the processing load on your computer, which, in turn, drives up mining costs.

When you hear that someone has a Bitcoin mining rig, this simply means they have specially built computer processors tailored to the SHA-256 algorithm. These devices, known as Application-Specific Integrated Circuit (ASIC) miners, can make guesses at the SHA-256 puzzle thousands of times faster than general-purpose hardware.
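The guessing loop described above can be sketched in a few lines of Python. This is a toy illustration, not real mining: actual Bitcoin block headers are 80-byte binary structures and the target comes from the network's difficulty rules, but the double-SHA-256 nonce search is the same basic idea.

```python
import hashlib

def mine(block_data: str, difficulty_bits: int) -> int:
    """Search for a nonce whose double-SHA-256 hash of the block data
    falls below a target. A smaller target (more difficulty bits)
    means more guesses are needed on average."""
    target = 2 ** (256 - difficulty_bits)
    nonce = 0
    while True:
        payload = f"{block_data}{nonce}".encode()
        digest = hashlib.sha256(hashlib.sha256(payload).digest()).digest()
        if int.from_bytes(digest, "big") < target:
            return nonce  # this guess "wins" the block
        nonce += 1

# A tiny difficulty so the search finishes instantly; the real network
# performs quintillions of such guesses per second.
winning_nonce = mine("block with some transactions", difficulty_bits=16)
print(winning_nonce)
```

Note that there is no shortcut: the only way to find the winning nonce is to try values one after another, which is exactly the work an ASIC accelerates.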

The cool thing about Bitcoin is that it's not purely mathematical; there is real economic psychology behind its design. For example, the larger the Bitcoin network, the more secure it becomes and the higher the value of BTC. In turn, the higher the market value of Bitcoin, the more miners enter the market.

As mining power joins or leaves the network, the difficulty of the puzzle adjusts accordingly. These adjustments ensure that mining rewards get paid out roughly every ten minutes. The rewards are vital to the Bitcoin network for two main reasons. Firstly, this strategy incentivizes nodes to continue validating transactions.
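The ten-minute cadence is kept by a retarget rule: every 2016 blocks, the network compares how long those blocks actually took against the two weeks they should have taken and scales the difficulty, with the change clamped to a factor of four in either direction. A simplified sketch:

```python
def retarget(old_difficulty: float, actual_seconds: float) -> float:
    """Sketch of Bitcoin's difficulty retarget: every 2016 blocks,
    scale difficulty so blocks average ~10 minutes, clamping the
    adjustment to 4x in either direction as the real rule does."""
    expected = 2016 * 600  # 2016 blocks at 600 seconds each
    ratio = expected / actual_seconds
    ratio = max(0.25, min(4.0, ratio))
    return old_difficulty * ratio

# Blocks arrived twice as fast as intended -> difficulty doubles.
print(retarget(1000.0, 2016 * 300))   # 2000.0
# Blocks arrived 10x too slow -> the clamp limits the drop to 4x.
print(retarget(1000.0, 2016 * 6000))  # 250.0
```

The function names and numbers here are illustrative, but the 2016-block window, 600-second target, and factor-of-four clamp match the protocol's actual retarget rule.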

Secondly, these rewards are the only way new BTC enters the market. There will only ever be a total of 21 million BTC available to the world. The difficulty adjustment and mining rewards system ensures that these BTC enter the market in a consistent and predictable manner.
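The 21 million cap follows directly from the halving schedule: a 50 BTC subsidy for the first 210,000 blocks, halved every 210,000 blocks thereafter. A quick calculation (a float sketch; the protocol itself counts integer satoshis) shows the geometric series converging just below 21 million:

```python
def total_supply_btc() -> float:
    """Sum the block subsidies across all halving epochs: 50 BTC per
    block for the first 210,000 blocks, halving every 210,000 blocks
    until the subsidy drops below one satoshi (1e-8 BTC)."""
    subsidy = 50.0
    supply = 0.0
    while subsidy >= 1e-8:
        supply += 210_000 * subsidy
        subsidy /= 2
    return supply

print(total_supply_btc())  # a little under 21,000,000
```

The series 50 + 25 + 12.5 + … approaches 100 BTC per block-slot but never reaches it, which is why the total supply approaches, but never quite hits, 21 million.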


Now let's compare this sound mathematical process to that of central bankers today. In the centralized financial system, currency is issued on a whim. Just recently, the US government injected trillions of dollars into the market as part of the Covid-19 stimulus package. However, these funds are sure to disrupt the delicate supply-and-demand balance. Consequently, inflation is sure to follow.

The world needs Bitcoin now more than ever. Bitcoin represents a real danger to the centralized markets because, for the first time in history, it provides the world with a secure digital alternative to the fiat systems in place. Unlike its predecessor, gold, Bitcoin is available to the entire world and requires very little overhead in terms of security.

Now, let's compare gold and Bitcoin for a moment to see why cryptocurrencies could become the future reserve currencies of the world. Firstly, it's important to acknowledge that gold served, and still serves, an important purpose in the market as a safe haven for investors. Gold is extremely stable and universally accepted.

The problems with gold are systemic. For one, gold only functions as a reserve asset; you couldn't use gold for day-to-day micro-transactions. Imagine going to your local grocery store and chipping off some gold to pay for your items; that's hardly realistic in 2020.

Additionally, gold isn't an asset that you can readily get your hands on. Sure, there are tons of gold investors today, but what do they really own? If your gold isn't in a safe on your property, you really just own a piece of paper stating that you own gold. Sadly, in times of great economic strife, gold owners learn this lesson the hard way: your gold can be taken from you for nearly any reason.

A perfect example of gold investors coming to terms with this reality occurred in the US in the 1930s. During this time, Franklin D. Roosevelt's government seized citizens' gold bullion and coins via Executive Order 6102. The order forced all citizens to sell their gold to the government at well below market rates. Those who refused had their gold confiscated.

Bitcoin holders never have to worry about this scenario. You hold your Bitcoin directly, not just a note of ownership. Bitcoin relies on a pair of cryptographic keys to keep your holdings safe. The public key is what you give people so they can send you BTC, whereas the private key is how you access your wallet. You must never give your private key out to anyone.
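The shape of that key pair is easy to picture. The sketch below is heavily simplified: the public key shown is a placeholder byte string rather than real secp256k1 elliptic-curve math (which needs a dedicated library), and real addresses use SHA-256 followed by RIPEMD-160 plus a checksum, approximated here with a truncated double SHA-256.

```python
import hashlib
import secrets

# A Bitcoin private key is just a 256-bit random number, generated
# locally with no bank or registry involved.
private_key = secrets.token_bytes(32)  # keep this secret

# The real public key is derived from the private key by one-way
# elliptic-curve math; this placeholder only mimics its 33-byte
# compressed shape.
public_key = b"\x02" + b"\x00" * 32  # placeholder, NOT real EC math

# An address is essentially a hash of the public key. Hashing is also
# one-way, so sharing the address reveals nothing about the key.
address_hash = hashlib.sha256(hashlib.sha256(public_key).digest()).digest()[:20]

print(private_key.hex())   # 64 hex characters: never share this
print(address_hash.hex())  # 20-byte hash: safe to share
```

The point of the design is the one-way arrows: private key to public key to address is easy, and every reverse step is computationally infeasible.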

As you already learned, Bitcoin's decentralized network is set up in a way that makes it practically impossible for governments to stop it. Additionally, the cryptographic keys prevent overreaching governments from snagging your hard-earned BTC whenever they deem it necessary.

Bitcoin functions as both a currency and a store of value. You can HODL your BTC and enjoy the appreciation, or you can trade or spend your Bitcoin freely. This unique currency affords investors the flexibility of cash, the convenience of digital transactions, and the value-storage capabilities of gold.

The future for Bitcoin looks bright. The network is larger and more secure than ever, and more people know about this revolutionary protocol than ever before. The world's first cryptocurrency also gained some new functionality recently via the Lightning Network.

After the crypto craze of 2017, it became evident that BTC's scaling issues needed resolution. Network traffic reached a point where BTC was unable to fulfill one of its primary roles: it could no longer function as a peer-to-peer cash system due to extreme volatility, delayed transaction times, and huge fees.

Luckily, developers have since corrected many of these issues via updates and other developments. The Lightning Network is one of these developments that continue to garner attention in the market. The Lightning Network is an off-chain protocol that relies on private payment channels to reduce network congestion.

Additionally, the Lightning Network provides BTC with new functionality, such as the ability to utilize smart contracts and oracles. Oracles are off-chain data sources that can trigger on-chain events such as smart contracts.
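A toy model of a payment channel shows why this relieves congestion: the two parties exchange balance updates privately, and only the final state would ever be settled on-chain. Everything below is illustrative; real Lightning channels use Bitcoin scripts, signatures, and penalty mechanisms that this sketch omits entirely.

```python
class PaymentChannel:
    """Toy off-chain payment channel: balances update privately and
    only the closing state would be broadcast to the blockchain."""

    def __init__(self, alice_deposit: int, bob_deposit: int):
        self.balances = {"alice": alice_deposit, "bob": bob_deposit}
        self.updates = 0  # count of off-chain updates

    def pay(self, sender: str, receiver: str, amount: int) -> None:
        if self.balances[sender] < amount:
            raise ValueError("insufficient channel balance")
        self.balances[sender] -= amount
        self.balances[receiver] += amount
        self.updates += 1  # no blockchain transaction occurs here

    def close(self) -> dict:
        # Only this final state would be settled on-chain.
        return dict(self.balances)

channel = PaymentChannel(alice_deposit=100_000, bob_deposit=50_000)
for _ in range(1000):                # a thousand micro-payments...
    channel.pay("alice", "bob", 10)  # ...with zero on-chain transactions
print(channel.close())               # {'alice': 90000, 'bob': 60000}
```

A thousand payments collapse into at most two on-chain transactions (opening and closing the channel), which is the congestion relief the paragraph describes.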

Today, Bitcoin is a household name. Amazingly, Nakamoto's single coin inspired a digital revolution in the market. There are thousands of cryptocurrencies now available to investors. While many of these platforms improve upon Bitcoin's core design, none can match Bitcoin's network strength and overall community support. For this reason, Bitcoin continues to reign as the king of cryptocurrencies.

See the original post here:
Investing In Kava - Everything You Need to Know - Securities.io

The Power and Paradox of Bad Software – WIRED

When I go to the doctor, they ask what I do, and when I tell them, they start complaining to me about the software at the hospital. I love this, because I hate going to the doctor, and it gives us something to talk about besides my blood pressure.

This is a pattern in my life: When I'm asking at the library reference desk, chatting with the construction contractor with her iPad, or applying for a loan at the bank, I just peer over their shoulder a bit while they're answering a question (not so much as to be intrusive) and give a low little whistle at the mess on their screens. And out pours a litany of wasted hours and bug reports. Now I've made a friend.

Good software makes work easier, but bad software brings us together into a family. I love bad software, which is most of it. Friends text me screenshots of terrible procurement systems, knowing that I will immediately text back, BANANACAKES. I'll even watch videos of bad software. There are tons on YouTube, where people demo enterprise resource-planning systems and the like. These videos fill me with a sort of yearning, like when you step inside some old frigate they've turned into a museum.

Best I can tell, the bad software sweepstakes has been won (or lost) by the climate change folks. One night I decided to go see what climate models actually are. Turns out they're often massive batch jobs that run on supercomputers and spit out numbers. No buttons to click, no spinny globes or toggle switches. They're artifacts from the deep, mainframe world of computing. When you hear about a climate model predicting awful Earth stuff, they're talking about hundreds of Fortran files, with comments at the top like "The subroutines in this file determine the potential temperature at which seawater freezes." They're not meant to be run by any random nerd on a home computer.

This doesn't mean they're inaccurate. They're very accurate. As code goes, the models are amazing, because they're attempts to understand the entire, actual Earth via programming. All the ocean currents, all the ice and rain, all the soil and light. And if you feel smart, reading a few pages of climate model code will fix you up tout de suite. If you, too, would like to know exactly how little you know about the machinery of the natural world, go on GitHub and look through the Modular Ocean Model 6, released by the National Oceanic and Atmospheric Administration, which is part of the Department of Commerce. Only America would make the weather report to money.

The software people get amazing tools that let them build amazing apps, and the climate people get lots of Fortran. This is one of the weirdest puzzles of this industry.

Every industry or discipline has its signature software. Climate has big batch climate models. Sales has the CRM, hence Salesforce. Doctors have those awful health care records systems; social scientists use SPSS or SAS or R; financial types plug everything into Excel. There are big platforms that help people do all kinds of work. But you know what blows them away? Software for making software. The software industry's software is so, so good (not that people don't complain). Just take a look at the modern IDE (integrated development environment), the programs programmers use to program more programs. The biggest are made by tech giants: Xcode (Apple) and Visual Studio (Microsoft) and Android Studio (Google), for example. I love to mock software, and yeah, these programs are huge and sprawling, but when I open these tools I feel like a medieval stonemason dragged into midtown Manhattan and left to stare at the skyscrapers. My mouth hangs open and my chisel falls from my sandstone-roughened hands.

In an IDE you drag buttons around to make the scaffolding for your apps. You type a few letters and the software guides your hand and finishes your thoughts, showing you functions inside of functions and letting you pick the right one for the task. Ultimately you click a little triangle (like Play on a music player) and it builds the app. I never get over it. And they give it away for free, so that people use it to make more software, which is why all the real estate in New York City is worth around a trillion and a half bucks, and Apple, which takes its famous 30 percent cut in the App Store, is worth $2 trillion. Of course, that's a down payment when you consider what we're going to pay to mitigate climate change.

Excerpt from:
The Power and Paradox of Bad Software - WIRED

Peace Train: Chelsea Manning, Julian Assange caught in repressive regime – Colorado Daily

  1. Peace Train: Chelsea Manning, Julian Assange caught in repressive regime  Colorado Daily
  2. Assange revelations among most important in US history, says Daniel Ellsberg  ComputerWeekly.com
  3. At Assanges Extradition Hearing, Troubled Tech Takes Center Stage  The New York Times
  4. Assange on Trial: Diligent Redactions and Avoiding Harm  CounterPunch
  5. Assange's extradition hearing is a farce: defend freedom of speech!  In Defence of Marxism

More:
Peace Train: Chelsea Manning, Julian Assange caught in repressive regime - Colorado Daily

Why are Amnesty International monitors not able to observe the Assange hearing? – Amnesty International

Earlier this month, the street outside the Old Bailey criminal court in London, where Julian Assange's extradition hearing has been taking place, was transformed into a carnival.

Inside the Old Bailey, the courtroom has turned into a circus. There have been multiple technical difficulties, a COVID-19 scare which temporarily halted proceedings, and numerous procedural irregularities, including the decision by the presiding judge to withdraw permission for Amnesty International's fair trial observer to have access to the courtroom.


Arriving at the court each morning was an assault on the senses, with the noise of samba bands, sound systems, and chanting crowds, and the sight of banners, balloons, and billboards at every turn.

The first day of the hearing, which started on Monday 7 September, drew more than two hundred people to gather outside the court. People in fancy dress mingled with camera crews, journalists and a pack of hungry photographers who would disappear regularly to give chase to any white security van heading towards the court, pressing their long lenses against the darkened windows.

One of the vans had come from Belmarsh high-security prison, Julian Assange's home for the last 16 months.

The Wikileaks founder was in court for the resumption of proceedings that will ultimately decide the Trump administration's request for his extradition to the US. The American prosecutors claim he conspired with a whistleblower (army intelligence analyst Chelsea Manning) to obtain classified information. They want him to stand trial on espionage charges in the US, where he would face a prison sentence of up to 175 years.

Assange's lawyers began with a request that the alleged evidence in a new indictment handed down in June be excluded from consideration, given that it came so late. The judge denied this. In the afternoon session, the lawyers requested an adjournment until next year to give them time to respond to the US prosecutors' new indictment. They said they had been given insufficient time to examine the new allegations, especially since they had only limited access to the imprisoned Assange. Indeed, this most recent hearing was the first time in more than six months that Julian Assange had been able to meet with his lawyers. The judge rejected this request as well.


Reacting to the decision, Kristinn Hrafnsson, the editor-in-chief of Wikileaks, told me: "The decision is an insult to the UK courts and to Julian Assange and to justice. For the court to deny the request to adjourn is denying Assange his rights."

Amnesty International had requested access to the court for a trial monitor to observe the hearings, but the court denied us a designated seat in court. Our monitor initially did get permission to access the technology to monitor remotely, but the morning the hearing started, he received an email informing us that the judge had revoked Amnesty International's remote access.

We applied again for access to the proceedings on Tuesday 8 September, setting out the importance of monitoring and Amnesty International's vast experience of observing trials in even some of the most repressive countries.

The judge wrote back expressing her "regret" at her decision and saying: "I fully recognise that justice should be administered in public." Despite her regret and her recognition that scrutiny is a vital component of open justice, the judge did not change her mind.

If Amnesty International and other observers wanted to attend the hearing, they would have to queue for one of the four seats available in a public gallery. We submitted a third application to gain direct access to the overflow room at the court where some media view the livestream, but this has also been denied.


The judge's refusal to give any "special provision" to expert fair trial monitors is very disturbing. Through this refusal, the court has failed to recognize a key component of open justice: namely, how international trial observers monitor a hearing for its compliance with domestic and international law. They are there to evaluate the fairness of a trial by providing an impartial record of what went on in the courtroom, and to advance fair trial standards by putting all parties on notice that they are under scrutiny.

Amnesty International has monitored trials from Guantanamo Bay to Bahrain, Ecuador to Turkey. For our observer to be denied access profoundly undermines open justice.

In the court, the overflow room has experienced ongoing problems with sound and video quality. More than a week after the proceedings began, these basic technical difficulties had not been properly ironed out, and large sections of witness evidence were inaudible. Nor were the problems restricted to the overflow room: last week, some witnesses trying to call into the courtroom were unable to get through. These difficulties have hampered the ability of those in the courtroom to follow the proceedings.


We are still hopeful that a way can be found for our legal expert to monitor the hearing, because the decision in this case is of huge importance. It goes to the heart of the fundamental tenets of media freedom that underpin the right to freedom of expression and the public's right to access information.

The US governments unrelenting pursuit of Julian Assange for having published disclosed documents is nothing short of a full-scale assault on the right to freedom of expression. The potential chilling effect on journalists and others who expose official wrongdoing by publishing information disclosed to them by credible sources could have a profound impact on the public's right to know what their government is up to.

If Julian Assange is silenced, others will also be gagged, either directly or by the fear of persecution and prosecution that will hang over a global media community already under assault in the US and in many other countries worldwide.

The US Justice Department is not only charging a publisher who has no non-disclosure obligation, but a publisher who is not a US citizen and is not in America. The US government is behaving as if it has jurisdiction all over the world to pursue any person who receives and publishes information of government wrongdoing.

If the UK extradites Assange, he would face prosecution in the USA on espionage charges that could send him to prison for the rest of his life, possibly in a facility reserved for the highest-security detainees and subject to the strictest of daily regimes, including prolonged solitary confinement. All for doing something news editors do the world over: publishing information provided by sources that is in the interest of the wider public.


Outside the court, I bumped into Eric Levy, aged 92. His interest in Assange's case is personal. He was in Baghdad during the American "shock and awe" bombardment in 2003, having travelled to Iraq as part of the Human Shield Movement, aiming to stop the war and, failing that, to protect the Iraqi population.

"I'm here today for the same reason I was in Iraq. Because I believe in justice and I believe in peace," he tells me. "Julian Assange is not really wanted for espionage. He is wanted for making America look like war criminals."

Indeed, it is ironic that no one responsible for possible war crimes in Iraq and Afghanistan has been prosecuted, let alone punished. And yet the publisher who exposed their crimes is the one in the dock facing a lifetime in jail.

See the rest here:
Why are Amnesty International monitors not able to observe the Assange hearing? - Amnesty International

Here’s to you, Julian Assange! – DiEM25

There is an old joke, from the time of World War I, about an exchange of telegrams between the German army headquarters and the Austro-Hungarian one. From Berlin to Vienna, the message is: "The situation on our part of the front is serious, but not catastrophic." The reply from Vienna: "With us, the situation is catastrophic, but not serious."

The reply from Vienna seems to offer a model for how we tend to react to crises today, from the Covid-19 pandemic to the forest fires (not only) in the western US: yes, we know a catastrophe is pending, the media warn us all the time, but somehow we are not ready to take the situation quite seriously.

The same holds for the fate of Julian Assange. It's a legal and moral catastrophe: just recall how he is treated in prison, unable to see his children and their mother, unable to communicate regularly with his lawyers, a victim of psychological torture, so that his very survival is under threat. They are, for certain, killing him softly, as the song goes. But very few seem to take his situation seriously, with an awareness that our own fate is at stake in his case.

The forces which violate his rights are the forces which prevent an effective battle against global warming and the pandemic. They are the forces because of which the pandemic makes the rich even richer and hits the poor hardest. They are the forces which ruthlessly exploit the pandemic to assert their control over our social and digital space, regulating and censoring it at our expense: the forces which protect us, but also protect us from our own freedom.

Assange fought for the public transparency of the digital space, and there is a cruel irony in the fact that the pandemic is used as a pretext to isolate him from his family and his defense. We are always ready to protest the limitation of basic human freedoms imposed on Hong Kong by China; should we not turn the gaze back on ourselves? Today one should remember Max Horkheimer's old saying from the late 1930s: "Those who don't want to talk critically about capitalism should also keep silent about Fascism." Our version is: those who don't want to talk about the injustice done to Assange should also keep silent about the violation of human rights in Hong Kong and Belarus.

Now that Assange's very survival is at stake, only a mass movement can (perhaps) save him. Remember the lyrics (written by Joan Baez to Ennio Morricone's music) of "Here's to You", the title song of the movie Sacco and Vanzetti:

"Here's to you, Nicola and Bart / Rest forever here in our hearts / The last and final moment is yours / That agony is your triumph."

There were mass gatherings all around the world in defense of Sacco and Vanzetti, and the same is needed now in defense of Assange, although in a different form. Assange cannot die: even if he dies (or disappears into a US prison cell like the living dead), that agony will be his triumph; he will die in order to live in all of us. This is the message we all must deliver to those who hold him: if you kill a man, you create a myth which will continue to mobilize thousands.

The message to us of those who are after Assange is clear: everything is permitted (to us). Why only to them? What they are doing to Assange is radically changing the political weather, so perhaps we need new Weathermen.

Photo: drawing of Julian Assange by Daniel Fooks.

Photo Source: Daniel Fooks on Twitter.


Here is the original post:
Here's to you, Julian Assange! - DiEM25