All Things Open expects to draw thousands worldwide for all-digital conference – WRAL Tech Wire

All Things Open, the largest open-source event in the Southeast, is bringing its two-day conference experience online this fall. Held on October 19-20, All Things Open 2020 is expected to draw thousands of designers, developers, entrepreneurs and technologists worldwide.

Now in its eighth consecutive year, All Things Open is a pillar of the Triangle's tech ecosystem. The event takes place in Raleigh every fall and, in recent years, has amassed crowds of over 4,000 people from around the world. Last year's conference brought in the largest turnout ever, with 4,985 attendees from 41 states and 27 countries. The 2018 program drew 4,079 attendees.

All Things Open founder Todd Lewis expects turnout to double for the virtual iteration of the program, with more than 10,000 developers, technologists and open-source leaders from around the world joining. "We are removing the travel barrier, so we expect a more global audience," Lewis adds.

Though the online venue will naturally change the attendee experience, Lewis believes All Things Open 2020 can come close to replicating the in-person atmosphere. "Being virtual will allow us to do some things not possible when in person and will allow us to feature speakers we never would have otherwise been able to host," Lewis says.

For the first time ever, tickets to this year's event are completely free. Previously, All Things Open tickets started at around $150. Though general admission is free in 2020, VIP tickets with extra perks are also available.

Three co-located events will take place on day one (October 19) of All Things Open 2020:

Lewis says that these three sub-events, in addition to the more traditional keynote, workshop and track session programming, will provide more variety to attendees than ever before.

Day two of All Things Open will include several 15-minute plenary talks and keynotes, along with 45- to 90-minute sessions in multiple content tracks.

Ahead of October, organizers are busy finalizing the schedule and roster of speakers. All Things Open aims to have more than 200 open-source technologists and subject matter experts in the speaker lineup. Over 100 people have been announced so far, including developers, managers and executives from Facebook, Google, Microsoft, Amazon Web Services, Spotify, Red Hat, SAS, GitHub, Fidelity Investments, PayPal and other major tech firms.

Lewis says that the team is structuring sessions and sub-events to maximize the attendee experience and allow for ample education and networking. The online event platform, he adds, is intuitive and easy for attendees to navigate.

"The platform we're using will really help connect people and enable new relationships and communication," Lewis said. "These new relationships have long-tail value and can benefit attendees for years into the future."


Welcome to the freewheeling world of crypto lending – The National

It sounds like a surefire bet. You lend money to a borrower who puts up collateral that exceeds the size of the loan, and then you earn interest of about 20 per cent. What could possibly go wrong?

That's the proposition presented by "DeFi", or decentralised finance, peer-to-peer cryptocurrency platforms that allow lenders and borrowers to transact without the traditional gatekeepers of loans: banks.

And it has exploded during the Covid-19 crisis.

Loans on such platforms have risen more than seven-fold since March to $3.7 billion (Dh13.5bn), according to industry site DeFi Pulse, as investors hunt returns at a time when central banks across the world have slashed interest rates to prop up economies battered by the pandemic.

Proponents say DeFi sites, which run on open-source code with algorithms that set rates in real-time based on supply and demand, represent the future of financial services, providing a cheaper, more efficient and accessible way for people and companies to access and offer credit.
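The rate-setting algorithms described here are typically utilization curves: the larger the share of a lending pool that is borrowed out, the higher the rate, which draws in new supply. The sketch below illustrates the idea with a simple linear model; the `base` and `slope` parameters are invented for illustration and do not correspond to any real protocol's settings.

```python
def borrow_rate(borrowed: float, supplied: float,
                base: float = 0.02, slope: float = 0.25) -> float:
    """Borrow rate rises linearly with pool utilization.
    `base` and `slope` are invented parameters for this sketch."""
    if supplied <= 0:
        return base
    return base + slope * (borrowed / supplied)

def supply_rate(borrowed: float, supplied: float) -> float:
    """Lenders earn the borrow rate scaled by utilization, so a
    heavily borrowed pool pays suppliers more and attracts deposits."""
    if supplied <= 0:
        return 0.0
    utilization = borrowed / supplied
    return borrow_rate(borrowed, supplied) * utilization

# A 90%-utilized pool pays lenders far more than a 10%-utilized one:
print(round(supply_rate(900, 1000), 4))  # 0.2205 (22.05% APR)
print(round(supply_rate(100, 1000), 4))  # 0.0045 (0.45% APR)
```

Because the curve reprices continuously as funds flow in and out, no human sets the rate, which is the "algorithms in real-time" behaviour the article refers to.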

But with the promise of high rewards comes high risk.

Lawyers and analysts say such sites are vulnerable to coding bugs and hacks, and most are untested at scale and unregulated – the latter typical of much of a global cryptocurrency sector mistrustful of the financial establishment.

Critics warn the technology could be the next overblown bubble of the crypto world, akin to initial coin offerings (ICOs), with inexperienced investors at particular risk. In 2017, billions of dollars poured into ICOs, where companies raised capital by issuing new virtual coins. Most projects failed to gain traction, and many investors lost their money.

"These are experiments in finance," says Preston Byrne of law firm Anderson Kill in New York. "They're not necessarily legally compliant in a lot of cases. But that doesn't mean that they can't be at some [point in the] future."

DeFi is nonetheless surging in popularity.

Seven years ago, Brice Berdah dreamt of retiring in his mid-30s. He worked out what he would need to save: "The exact amount was 1.7 million euros [Dh7.4m]. My plan was to make 5 per cent on my capital."

Reality, though, scuppered his plans. Low interest rates meant his savings stagnated, while inquiries into real estate and car-parking businesses came to naught.

"By 27, I had only saved about 0.5 per cent of the required amount," says Mr Berdah, who works at a start-up that makes digital wallets for storing digital coins. "It was an obvious failure."

To resurrect his dream, Mr Berdah, now 28 and based in Paris, has turned to DeFi.

"Now I'm using DeFi, I've readjusted my retirement plans," he says, adding that he's bet 90 per cent of his net worth on DeFi. "Returns are about 20 to 25 per cent over the last six months ... and I'm on track just now."

While DeFi's roots are in a crypto sector hostile to mainstream finance, some of its aims – like cutting costly steps and paperwork in financing – have caught the attention of the firms it seeks to undermine.

In the future, backers say, bonds or stocks will be issued and traded directly on their blockchain-based platforms instead of by investment banks or centralised exchanges. Code, not humans, will oversee the processes, they say.

For their part, major banks are looking at how such technology can be used to complement, rather than upend, established finance. Goldman Sachs, for example, has hired a new head of digital assets to look at how assets can exist on blockchain technology, says a representative.

"There is an actual value on what is being built on these protocols," says Maya Zehavi, a blockchain consultant and board member of an Israeli blockchain industry group. "It might end up being an instant financialisation ecosystem for any project. That's the promise."

Most DeFi platforms are based on the ethereum blockchain, the backbone for ether, the second-biggest cryptocurrency after Bitcoin. Unlike Bitcoin, ethereum's blockchain can be used to create digital contracts, while developers can more easily build new software or apps on it.

Loans are recorded, issued and managed by the blockchain-based contracts. Borrowers must offer collateral, also in cryptocurrency, usually worth more than the loans they take out.
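The overcollateralization rule just described can be sketched as a simple check. The 75 per cent collateral factor used below is a hypothetical figure chosen for illustration, not any specific platform's parameter.

```python
def max_borrow(collateral_value: float, collateral_factor: float = 0.75) -> float:
    """Maximum debt a deposit supports. The 75% collateral factor is a
    hypothetical figure, not any specific platform's parameter."""
    return collateral_value * collateral_factor

def is_undercollateralized(debt: float, collateral_value: float,
                           collateral_factor: float = 0.75) -> bool:
    """True when debt exceeds what the collateral supports, i.e. the
    position has become eligible for liquidation."""
    return debt > max_borrow(collateral_value, collateral_factor)

print(max_borrow(1000))                   # 750.0: $1,000 locked backs at most $750
print(is_undercollateralized(800, 1000))  # True: too much debt for the collateral
```

The excess collateral is the lender's only protection: there is no credit check and no legal recourse, so the smart contract must always hold more value than it has lent out.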

DeFi is not for the faint-hearted. Borrowers are typically traders who take out loans in, say, ethereum, then use the coins to trade on various exchanges against other cryptocurrencies. They then aim to pay back the loan and pocket their profits – comparable to short-sellers in stock markets.
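The borrow-to-trade strategy is economically a short position: sell borrowed coins now, buy them back later, and keep the difference after repaying the loan. A minimal sketch of its profit and loss, with hypothetical prices and an assumed flat interest charge:

```python
def short_pnl(coins: float, sell_price: float, buyback_price: float,
              interest: float) -> float:
    """Sell borrowed coins now, buy them back later to repay the loan
    plus interest (interest expressed as a fraction of the coins owed).
    All prices and rates here are hypothetical."""
    proceeds = coins * sell_price                   # sell the borrowed coins
    repay = coins * (1 + interest) * buyback_price  # repurchase loan + interest
    return proceeds - repay

# Borrow 10 coins at $400, price falls to $300, 1% interest owed:
print(round(short_pnl(10, 400, 300, 0.01), 2))  # 970.0
```

Note the asymmetry that makes this risky: if the price rises instead of falling, the buyback cost grows without limit, and the collateral backing the loan can be wiped out.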

One such borrower is Antoine Mouran, a computer science student at a university in Lausanne, Switzerland.

Mr Mouran borrows the USD Coin cryptocurrency on lending platform Aave, and then uses the loan to trade Lend coins.

The profits on a typical trade? "Depending on the starting price, they can reach 30 per cent," Mr Mouran says.

"My portfolio is a couple of thousand dollars," the 18-year-old adds. "I trade for fun, to discover new technologies such as decentralised finance."

Aave has been a big beneficiary of the recent DeFi boom, with its loans sky-rocketing by nearly 7,000 per cent since June to $1.4bn, DeFi Pulse data shows.

Stani Kulechov, founder of the platform, says user interest has been "enormous" in recent months, but he acknowledges the pitfalls of the fledgling lending industry.

Mr Kulechov says the code that underpins DeFi lending is capable of regulating itself without the need for oversight by centralised bodies like financial regulators – but only as long as it works correctly.

"The problem is when smart contracts behave in a way that they shouldn't, and when things go wrong."

However, failures in code – from bugs to hacks – are common.

On March 12 this year, for example, major DeFi lending platform Maker, with about $1.4bn of loans, was rocked by a sudden drop in the price of ethereum. About 1,200 lenders saw their positions liquidated for virtually nothing, despite safeguards Maker had put in place to protect lenders against sudden market falls.
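Liquidation mechanics of the kind Maker uses can be sketched as a minimum collateralization ratio check; the 150 per cent threshold below is illustrative, not Maker's actual parameter.

```python
def is_liquidatable(debt_usd: float, collateral_eth: float,
                    eth_price: float, min_ratio: float = 1.5) -> bool:
    """Vaults must keep collateral value / debt above a minimum ratio
    (150% here is an illustrative figure). A sharp price drop pushes
    the ratio below the floor and the position becomes eligible for
    liquidation."""
    ratio = (collateral_eth * eth_price) / debt_usd
    return ratio < min_ratio

print(is_liquidatable(1000, 10, 190))  # False: ratio 1.9, position is safe
print(is_liquidatable(1000, 10, 120))  # True: ratio 1.2, up for liquidation
```

Because the check runs on-chain against a live price feed, a fast enough crash can push many vaults below the floor simultaneously – which is roughly what happened to Maker's lenders in March.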

Some industry players, such as Aave's Mr Kulechov, advocate self-regulation by platforms to create standards for smart contracts, aiming to prevent hacks or malfunctioning code.

The DeFi industry, however, is still far from that point.

Many purists are opposed to any oversight by humans or institutions, preferring to put faith in communities of users improving smart contracts, ironing out bugs through open-source programming.

More immediately, some users are turning to a more traditional industry for a degree of protection from DeFi platform failures: insurance. Some firms, such as London-based Nexus Mutual, offer coverage specifically against failures in smart contracts.

Britain's financial watchdog says it regulates some crypto-related activities, looking at them on a case-by-case basis. Even "decentralised" platforms may be subject to regulation, it said separately last year. US securities regulators did not respond to requests for comment.

Until regulation catches up, critics say, the risks of relying on the code may outweigh rewards.

"The people that lose out have no recourse," says Tim Swanson of blockchain payments firm Clearmatics.

"Code is not law."

Updated: August 31, 2020 03:49 PM


New open source API bug detection tool improves application security testing – BetaNews

Software development today usually involves the use of third-party APIs, libraries or frameworks that are complex, rapidly evolving, and sometimes poorly documented.

Security testing solutions company GrammaTech is launching its new Swap Detector, an open-source checker that detects application programming interface (API) usage errors.

Developed as part of a research project sponsored by the Department of Homeland Security Science and Technology Directorate's Static Tool Analysis Modernization Project (STAMP), Swap Detector improves application security testing for DevOps teams.

"Traditional static-analysis techniques do not take advantage of the vast wealth of information on what represents error-free coding practices available in the open-source domain," says Alexey Loginov, vice president of research at GrammaTech. "With Swap Detector we applied Big Data analysis techniques, what we call Big Code analysis, to the Fedora RPM open-source repository to baseline correct API usage. This allowed us to develop error-detection capabilities that far exceed the scalability and accuracy of conventional approaches to program analysis."

The Swap Detector interface integrates with a variety of static analysis tools and although initially focused on C/C++ programs, it's applicable to programs in other languages and is especially beneficial for languages that are interpreted and not compiled.

It uses multiple error-detection techniques, layered together to increase accuracy. For example, it compares argument names used at call sites with the parameter names used in the corresponding declarations. In addition, it uses 'Big Code' techniques, applying statistical information about known good API-usage patterns collected from a large sample of code and flagging usages that are statistically anomalous as potential errors. To improve the precision of the reported warnings, Swap Detector also applies false-positive reduction strategies to the output of both techniques.
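The argument-name comparison GrammaTech describes can be illustrated with a toy version: if each argument's name matches the *other* parameter's name better than the one in its own position, the call looks swapped. This sketch, which uses Python's difflib for string similarity and an invented threshold, is only a simplified illustration of the idea, not Swap Detector's actual algorithm.

```python
from difflib import SequenceMatcher

def _sim(a: str, b: str) -> float:
    """Crude string similarity between an argument and a parameter name."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

def looks_swapped(args: list, params: list, threshold: float = 0.6) -> bool:
    """Flag a two-argument call where each argument's name matches the
    *other* parameter's name better than its own position's name."""
    if len(args) != 2 or len(params) != 2:
        return False  # this toy version handles only two-argument calls
    straight = _sim(args[0], params[0]) + _sim(args[1], params[1])
    crossed = _sim(args[0], params[1]) + _sim(args[1], params[0])
    return crossed > straight and crossed / 2 >= threshold

# A call copy(src, dst) against a declaration copy(dest, source):
print(looks_swapped(["src", "dst"], ["dest", "source"]))  # True: likely swapped
print(looks_swapped(["dst", "src"], ["dest", "source"]))  # False: names line up
```

A production tool layers statistical evidence from large code corpora on top of checks like this, precisely because name similarity alone produces false positives.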

Swap Detector is available now via GitHub.

Photo Credit: Panchenko Vladimir/Shutterstock



Tachyum Prodigy Native AI Supports TensorFlow and PyTorch – HPCwire

SANTA CLARA, Calif., August 26, 2020 Tachyum Inc. announced that it has further expanded the capabilities of its Prodigy Universal Processor through support for TensorFlow and PyTorch environments, enabling a faster, less expensive and more dynamic solution for the most challenging artificial intelligence/machine learning workloads.

Analysts predict that AI revenue will surpass $300 billion by 2024, with a compound annual growth rate (CAGR) of up to 42 percent through 2027. Technology giants are investing heavily in AI to make it more accessible for enterprise use cases, which range from self-driving vehicles to more sophisticated and control-intensive disciplines like spiking neural networks, explainable AI, symbolic AI and bio AI. When deployed into AI environments, Prodigy can simplify software processes, accelerate performance, save energy and better incorporate rich data sets to allow for faster innovation.

Proprietary programming environments like CUDA are inherently hard to learn and use. With open-source solutions like TensorFlow and PyTorch, there are a hundred times more programmers who can leverage the frameworks to code large-scale ML applications for Prodigy. By supporting deep learning environments that make it easier to learn about, build and train diversified neural networks, Tachyum is able to move beyond the limitations facing those working exclusively with NVIDIA's CUDA or with OpenCL.

In much the same way that external floating-point coprocessors and vector coprocessor chips were internalized into the CPU, Tachyum is making external matrix coprocessors for AI an integral part of the CPU. With matrix operations integrated into Prodigy, Tachyum is able to provide high-precision neural network acceleration up to 10 times faster than other solutions. Tachyum's support for 16-bit floating point and lower-precision data types improves performance and saves energy in applications such as video processing. Faster than the NVIDIA A100, Prodigy uses compressed data types to allow larger models to fit in memory: instead of 20GB of shared coherent memory, Tachyum allows 8TB per chip and 64TB per node.
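The memory argument for lower-precision data types is straightforward arithmetic: halving the bits per parameter halves a model's raw footprint. A back-of-the-envelope sketch, using a hypothetical model size rather than any Prodigy-specific figure:

```python
def model_bytes(n_params: int, bits_per_param: int) -> int:
    """Raw storage for a model's parameters at a given precision."""
    return n_params * bits_per_param // 8

GB = 10**9
params = 10 * GB  # a hypothetical 10-billion-parameter model

fp32 = model_bytes(params, 32)  # 32-bit floats
fp16 = model_bytes(params, 16)  # 16-bit floats: half the footprint
print(fp32 // GB, "GB vs", fp16 // GB, "GB")  # 40 GB vs 20 GB
```

The same arithmetic explains why compressed or lower-precision formats let larger models fit in a fixed memory budget, at the cost of reduced numeric range and precision.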

Idle Prodigy-powered universal servers in hyperscale data centers will, during off-peak hours, deliver 10x more AI neural network training/inference resources than currently available, CAPEX-free (i.e. at low cost, since the Prodigy-powered universal computing servers are already bought and paid for). Tachyum's Prodigy enables edge computing and IoT products, which will have onboard high-performance AI inference optimized to exploit Prodigy-based AI training from either the cloud or the home office.

"Business and trade publications are predicting just how important AI will become in the marketplace, with estimates of more than 50 percent of GDP growth coming from it," said Dr. Radoslav Danilak, Tachyum founder and CEO. "What that means is that the less than 1 percent of data processed by AI today will grow to as much as 40 percent, and the 3 percent of the planet's power used by datacenters will grow to 10 percent in 2025. There is an immediate need for a solution that offers low power, fast processing and ease of use and implementation. By incorporating open source frameworks like TensorFlow and PyTorch, we are able to accelerate AI and ML into the world, with human-scale computing coming in 2 to 3 years."

Tachyum's Prodigy can run HPC applications, convolutional AI, explainable AI, general AI, bio AI and spiking neural networks, as well as normal data center workloads, on a single homogeneous processor platform with a simple programming model. Using CPUs, GPUs, TPUs and other accelerators in lieu of Prodigy for these different types of workloads is inefficient. A heterogeneous processing fabric, with unique hardware dedicated to each type of workload (e.g. data center, AI, HPC), results in underutilization of hardware resources and a more challenging programming environment. Prodigy's ability to seamlessly switch among these various workloads dramatically changes the competitive landscape and the economics of data centers.

Prodigy significantly improves computational performance, energy consumption, hardware (server) utilization and space requirements compared to existing chips provisioned in hyperscale data centers today. It will also allow edge and IoT developers to exploit its low power and high performance, along with its simple programming model, to deliver AI to the edge.

Prodigy is truly a universal processor. In addition to native Prodigy code, it also runs legacy x86, ARM and RISC-V binaries. And, with a single, highly efficient processor architecture, Prodigy delivers industry-leading performance across data center, AI and HPC workloads. Prodigy, the company's flagship Universal Processor, will enter volume production in 2021. In April, the Prodigy chip successfully proved its viability with a complete chip layout exceeding speed targets. As of August, the processor correctly executes short programs, with results automatically verified against the software model, while exceeding target clock speeds. The next step is a fully functional manufactured FPGA prototype of the chip later this year, the last milestone before tape-out.

Prodigy outperforms the fastest Xeon processors at 10x lower power on data center workloads, as well as outperforming NVIDIA's fastest GPU on HPC, AI training and inference. A mere 125 HPC Prodigy racks can deliver 32 tensor exaflops. Prodigy's 3x lower cost per MIPS and 10x lower core power translate to a 4x lower data center total cost of ownership (TCO), enabling billions of dollars of savings for hyperscalers such as Google, Facebook, Amazon, Alibaba and others. Since Prodigy is the world's only processor that can switch between data center, AI and HPC workloads, unused servers can be used as a CAPEX-free AI or HPC cloud, because the servers have already been amortized.

To see videos of the latest results, please go to https://www.tachyum.com/resources

About Tachyum

Tachyum is disrupting the data center, HPC and AI markets by providing universality, industry-leading performance, cost and power, while enabling data centers that are more powerful than the human brain. Tachyum, co-founded by Dr. Radoslav Danilak, begins production of its flagship product Prodigy, the world's first universal processor, in 2021. Prodigy brings unprecedented value, targeting a $50B market that is growing at 20 percent per year. With data centers currently consuming over 3 percent of the planet's electricity, and 10 percent by 2025, low-power Prodigy is critical to the continued doubling of worldwide data center capacity every four years. Tachyum has offices in the USA and Slovakia, EU.

Source: Tachyum


Dive Into Java Programming With 10 Hours of Training for $35.99 – Best gaming pro

We've been eagerly awaiting more details on Sonos' wireless headphone plans ever since Bloomberg let it be known in January 2019 that the multiroom audio company was actively developing them. Now, almost two years later, we have our first look at the designs thanks to a recently awarded patent from the U.S. Patent Office (USPO).

Buried inside the 45-page patent document are drawings that illustrate two potential headphone designs: one that uses a single-sided earcup fork, similar to the Bowers & Wilkins PX7 or Microsoft Surface Headphones 2, and another that uses what appears to be a forkless approach reminiscent of the Bose Noise Cancelling Headphones 700.

The patent also discusses several features, which give a strong sense of how the new designs might work – keeping in mind that patents are merely potential outcomes, not guarantees of what a company will launch.

It's clear that Sonos sees the headphones as an integral part of a Sonos whole-home sound system. They'll be able to operate over Bluetooth and Wi-Fi, just like the Sonos Move, and you'll be able to stream audio to them from a smartphone, from another Sonos product with built-in Apple AirPlay, like the Sonos Beam, or from a TV if it's connected to a Sonos soundbar like the Arc.

But it will also – just like every other Sonos device – be able to access streaming services like Spotify and Apple Music directly, without the need for a constant connection to a smartphone or tablet. This part will require Wi-Fi, and would likely drain battery life faster than using Bluetooth. Interestingly, the patent also makes room for the possibility that the headphones could have onboard storage for media files, something no other Sonos device currently offers.

Another feature the described headphones could offer is hands-free access to a voice assistant, using just a wake word like "Alexa" or "Hey, Google." Sonos already offers its customers the ability to use either Amazon's or Google's voice assistant this way via its voice-enabled products, which now include the Arc, Beam, One, and Move. This would also require Wi-Fi unless the headphones were being used with a smartphone over Bluetooth.

Speaking of Bluetooth, we've long wondered whether a set of Sonos wireless headphones would work both inside and outside the home. That sounds like a dumb question, but the reason for the doubt is well-founded. The Sonos app, which controls the entire Sonos experience at home, doesn't let you control the Sonos Move when you're outside the home.

In those situations, the Move is no different than any other Bluetooth speaker, which means you have to choose another app, like Apple Music, Spotify, or Tidal, to control content and playback.

But the patented Sonos headphone design does include a way of transferring playback sessions from the headphones to other Sonos devices and back again. This might be used to continue listening sessions that begin outside the home once you get back inside, or vice versa, as Protocol points out.

In many respects, these headphones will work just like other top-tier wireless headphones such as Sony's WH-1000XM4. There will be a way to control play/pause, volume up/down, and track skip forward/back. These will probably be touch controls of some kind, but they may be physical buttons. The patent suggests there may be plenty of sensors that detect everything from touch to voice to whether or not you're wearing the headphones.

There are references to active noise cancellation (ANC), so clearly Sonos anticipates that people will want this very popular feature on its headphones. One of the designs lets users control both volume and ANC through physical knobs on the ends of the headband stems, similar to how Microsoft's Surface Headphones 2 use a physical dial for ANC.

The patent doesn't include any pricing information or say when (if ever) Sonos will launch its own wireless headphones. In January 2019, Bloomberg reported that they might cost $300. That would be on the low side for comparably equipped headphones from Bose, Beats, Sony, and Sennheiser, especially when you consider their unique compatibility with the Sonos ecosystem.

When Digital Trends asked Sonos to comment on the patent, we were provided with the following statement from the company: "As a company founded in innovation, we're always working on different ideas and innovations that can help the world listen better. We continue to invest in our robust patent portfolio with dozens of new patents each year. We do not have more information to share at this time regarding our future product roadmap."

In other words: No comment.

We'll keep you posted as soon as we hear more about Sonos wireless headphones.


Investing in Tezos (XTZ) – Everything You Need to Know – Securities.io

What is Tezos (XTZ)?

Tezos (XTZ) is a fourth-generation blockchain network that incorporates advanced protocols to enable a host of functionalities. Primarily, the platform supports the development of decentralized applications (DApps) and the coding of smart contracts.

Tezos is an open-source decentralized network for assets and applications. Today, the Tezos community consists of researchers, developers, validators, and various support groups. All of these parties share the common goal to expand the Tezos ecosystem.

Tezos' history begins in 2014, when co-founders Arthur Breitman and Kathleen Breitman began development on their next-generation blockchain. Specifically, the Breitmans sought to simplify DApp development and create a unique decentralized ecosystem to cater to the needs of the digital economy.

Tezos officially launched in Switzerland in September 2018. Like many other projects in the sector, Tezos utilized a dual-company approach. Specifically, Tezos' founding company is Dynamic Ledger Solutions (DLS).

Arthur Breitman and Kathleen Breitman

Additionally, the group utilizes a foundation for its fundraising purposes. This non-profit is known as the Tezos Foundation. Importantly, the Tezos Foundation is the company that holds all the operating funds, including the funds collected during the ICO.

Tezos hit the market running. The firm hosted a record-breaking uncapped ICO in 2017. The event was a major success, securing $232 million in Bitcoin and Ether in just under two weeks. That success made international headlines and helped propel Tezos further into the spotlight.

Investors received XTZ for their Bitcoin and Ether. XTZ, also called "tez" or "tezzie," is the utility token of the Tezos ecosystem. Users can pay for services and execute smart contracts using XTZ. There are currently 741,546,948 XTZ in circulation.

Tezos never announced the total amount of XTZ the platform plans to release. Developers left this open in a bid to ensure that their platform never reaches its capacity in the market. However, some in the space argue that this lack of scarcity hurts the overall value of the coin.

Tezos' ICO success was short-lived. Within weeks, the president of the Tezos Foundation, Johann Gevers, and the Breitmans got into a public feud regarding the funds raised. Specifically, Gevers refused to disburse the funds to the Breitmans.

The issue was a huge debacle that caused investors to lose faith in the project. This led to the value of XTZ dropping temporarily. Eventually, Gevers left the project, and the funds made it to their destination. However, Gevers made sure to secure a $400,000 severance package for his troubles.

Tezos is unique in the market for a variety of reasons. For one, it utilizes a Liquid Proof-of-Stake (LPoS) consensus mechanism. Also, the platform introduces an agnostic native middleware known as a Network Shell. This strategy enables developers to utilize modules during the construction of applications.

Tezos is bilingual, meaning it utilizes both an imperative and a functional language. Imperative languages such as Solidity are ideal for smart contract programming in terms of flexibility, whereas functional languages are more adept at mathematical reasoning, making them more secure.


Tezos uses this combination to ensure its smart contracts are both robust and secure. Notably, the Tezos ecosystem relies on OCaml for blockchain programming and Michelson for the coding of smart contracts. This strategy also improves transaction speeds across the network.

Currently, XTZ is capable of around 1000 transactions per second (tps). The limit is based on the max allowed gas per transaction. This rate can also increase in the future via voting on protocol changes such as off-chain scaling solutions.
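
The relationship described above between the per-transaction gas ceiling and overall throughput can be sketched with back-of-the-envelope arithmetic. The constants below are illustrative placeholders, not real Tezos protocol parameters:

```python
# Rough throughput bound for a gas-limited chain: the block's gas
# budget divided by the gas cost of a typical transaction gives the
# transactions per block; dividing by the block time gives tps.
# All numbers here are illustrative, not actual Tezos values.

def max_tps(block_gas: int, gas_per_tx: int, block_time_s: float) -> float:
    txs_per_block = block_gas // gas_per_tx
    return txs_per_block / block_time_s

# A hypothetical 1,040,000-gas block of 1,040-gas transfers every 60 s:
print(max_tps(1_040_000, 1_040, 60.0))  # ~16.7 tps
```

Raising the block gas budget or lowering per-transaction gas costs through an on-chain protocol amendment would raise this bound, which is exactly the lever the voting process the article describes controls.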

Tezos offers users some features not available to earlier blockchains. To accomplish this task, Tezos combines its transaction and consensus protocols. This strategy streamlines its processes. Crucially, the combination aids in the communication between the network protocol and the blockchain protocol.

The Liquid PoS consensus mechanism is an upgrade to the Delegated Proof-of-Stake systems found in third-generation blockchains like EOS and NEO. In a DPoS, the community votes on who will function as a delegated node.

Importantly, Delegated nodes approve blocks and add the transactions to the blockchain. Additionally, they have a few more rights and responsibilities in the network. Crucially, the number of delegators allowed depends on the bond size minimum requirement. Currently, this limit allows up to around 70,000 delegators.

The LPoS mechanism is exclusive to Tezos at this time, and it has proven very successful. Currently, the network has a stake rate of approximately 80%, spread across 450 validators and 13,000 delegators. This makes Tezos one of the most decentralized blockchains in the sector.

The Liquid PoS offers users more control compared to DPoS systems. For one, every user gets a vote. This strategy helps to ensure a more cohesive community. Keenly, users can vote directly or delegate their voting responsibilities to another party.

Additionally, these delegates can then delegate their votes to other delegates via a process known as transitivity. Notably, users can reclaim their voting rights at any time, and they can override their representative's vote whenever they disagree with a decision.

The Liquid PoS consensus mechanism provides a balanced and inclusive approach to decentralized network security. Each person has a vote that counts in the final approval of network changes. Best of all, anyone can become a delegate for free. You just need to gain the respect of the community.

To participate in the process, a user simply needs to stake their XTZ in a network wallet. In the Tezos ecosystem, this process is called baking. The more XTZ you bake, the better your chances of adding the next block.

After the block bakes successfully, the network will have 32 random other bakers repeat the process. Once this process is complete, the baker receives a reward. Best of all, the baker gains the ability to charge transaction fees on all the transactions within the block.
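
The stake-weighted selection described above can be illustrated with a toy simulation. This is not the actual Tezos baking algorithm; it only shows how a larger stake translates into a proportionally larger chance of baking the next block (the baker names and stake amounts are invented):

```python
import random

def pick_baker(stakes: dict[str, float], rng: random.Random) -> str:
    """Choose the next baker with probability proportional to stake."""
    bakers = list(stakes)
    return rng.choices(bakers, weights=[stakes[b] for b in bakers], k=1)[0]

stakes = {"alice": 8_000.0, "bob": 1_500.0, "carol": 500.0}  # XTZ staked
rng = random.Random(42)  # fixed seed so the experiment is repeatable
picks = [pick_baker(stakes, rng) for _ in range(10_000)]

# alice holds 80% of the stake, so she bakes roughly 80% of the blocks
print(picks.count("alice") / len(picks))
```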

The Tezos system mitigates the chance of hard forks via this decentralized voting mechanism. Developers took extra care to ensure that the network has the capabilities to upgrade passively in a decentralized manner via self-amendments. In this way, Tezos seeks to keep its community focused on the same goals.

The voting process begins when a developer submits an upgrade proposal. The proposal must include the technical aspects of the upgrade. Also, it must include the compensation required by the developer for their efforts.


From here, the protocol will go before the community. The community will test the protocol and give valuable feedback on its merits. Notably, every protocol undergoes multiple testing periods. In this way, Tezos ensures that only top-quality code makes it onto the blockchain.

Following the completion of the testing period, Tezos token holders can vote on the upgrade directly. If approved, the protocol upgrade will integrate into the network via a protocol called a hot swap. Additionally, the developer will receive compensation from the Tezos Foundation for their efforts at this time.

The Tezos ecosystem provides you with the ability to operate under two different account types: Implicit Accounts and Originated Accounts. Critically, these accounts serve different purposes within Tezos' infrastructure. In most cases, an Implicit Account will cover basic functionality.

Implicit Accounts are the type of account most users possess. These addresses function similarly to traditional crypto accounts. Each Implicit Account includes both public and private keys. Users can check their balance and transfer funds to and from this address.

Originated Accounts are what developers utilize for smart contracts. They differ from Implicit Accounts in a couple of key ways. For one, these accounts always begin with a KT1 versus a tz1. All Originated Accounts include a Manager, Amount, Delegatable, and Delegate Field options.
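
A minimal sketch of the prefix distinction described above. The addresses below are made-up placeholders, and real Tezos addresses also use tz2/tz3 prefixes plus a Base58 checksum that this toy check ignores:

```python
def account_kind(address: str) -> str:
    """Classify a Tezos address by prefix: implicit accounts begin
    with "tz1", originated (smart contract) accounts with "KT1"."""
    if address.startswith("tz1"):
        return "implicit"
    if address.startswith("KT1"):
        return "originated"
    return "unknown"

print(account_kind("tz1ExamplePlaceholderAddress"))   # implicit
print(account_kind("KT1ExamplePlaceholderContract"))  # originated
```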

Tezos is available on most major exchanges today. Binance, the world's largest exchange, offers multiple Tezos trading pairs. To get started, you just need to register for an account. Once your account verification period is over, you can fund your account with fiat currency.

Once your account has funds, it only takes a second to transfer these funds over to Bitcoin or Ethereum. From here, you will want to exchange for XTZ. The entire process can be done in under ten minutes after your account verification completes.


Storing Tezos is easy. If you are new to the space, you can download a reliable mobile wallet in seconds. The top mobile wallets for this coin are Kukai Tezos Wallet and TezBox Wallet. Both are free to download and provide you with an easy-to-navigate interface.

If you intend to invest significant funds in the project, you should consider a hardware wallet. Manufacturers such as Trezor and Ledger both provide high-quality devices for around $100. These devices keep your crypto safely offline in cold storage.

Now that Tezos has overcome its internal company issues, the firm is ready to take its platform mainstream. Today, Tezos has one of the largest followings in the market. Consequently, you can expect to see Tezos among the top twenty cryptocurrencies for years to come.


How Artificial Intelligence is changing the way we search and find online – ITProPortal

Can a machine know a customer better than they know themselves? The answer is, for the purposes of shopping, yes it can.

First of all, Artificial Intelligence (AI) takes a dispassionate view of customers and their behavior online. In research, by contrast, consumers will often give contradictory answers, answers that then change over time depending largely on how they are feeling at that particular moment. As an indicator of how those consumers are likely to behave in terms of what they buy, this has proven to be unreliable.

AI on the other hand, supported by machine learning to deliver better and better outcomes over time, operates without emotions and simply reacts to and learns from what it is being told.

In online retail, AI is set to revolutionize the world of search. If "revolutionize" sounds too big a word for it, bear in mind that search technology has barely changed in 10 or more years. While brands have invested heavily in making their websites look amazing and optimized them to steer the customer easily to the checkout, they have generally used out-of-the-box search technology ever since the first commercial engine was launched by AltaVista back in 1995.

Given that typical conversion rates on retail websites are 2-3 percent, there is everything to play for in making search easier and more rewarding for shoppers. Retailers invest heavily in SEO and PPC to get customers from Google to their site, but too often think the job is done once they get there.

Products are then displayed to their best advantage on the site; email or newsletter sign up is offered; online chat is offered; promotions pop up; a list of nearby stores is offered; and so on. But at no point is the customer offered or given any help, apart from the online chat window which follows them around.

At this point, the customer may start to follow the journey laid out for them by the retailer, then get distracted and end up somewhere entirely different from where they intended. Some customers like to wander, but those who already knew what they were looking for do not.

Meanwhile, what has the retailer learned from all the precious time the customer has spent on their site? Only that the customer has not bought anything, and it is only at this point that an offer pops up or the online chat box appears. But none of these actions are based on any knowledge of the customer other than which pages they have looked at.

The search engine is not very good at learning; it may be able to refer the customer back to a page they looked at before, thanks to the consumer's digital footprint or the cookie the site left behind, but if that webpage was not useful, then the search process has actually gone backwards. So the customer continues to end up where they never wanted to go in the first place: ever-decreasing circles displaying a choice of unwanted products.

These on-site search functions can be compared to stubborn schoolchildren who simply refuse to learn, whatever they are taught. The customer searching online tries to make their query as accurate and intelligent as possible, while the search engine simply responds by sharing everything it knows without actually answering the question. AI, by contrast, can spot what the customer intends and give answers based on that intent, depending on where an individual shopper is in their own personal buying journey.

It then returns increasingly accurate results because it is learning from what the customer is telling it. Search thus becomes understanding because it is looking at behavior not just keywords, which is the current limit of conventional search engines. The AI can also create the best shopping experience beyond basic search, including navigation, to seamlessly and speedily advance a customer to the checkout.
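
As a concrete, deliberately simplified illustration of search that learns from behavior rather than keywords alone, consider a ranker that blends static keyword scores with observed clicks. This is a toy sketch, not any vendor's product; the product names, scores, and the 0.5 click weight are invented for illustration:

```python
from collections import defaultdict

class LearningRanker:
    """Rank products by keyword relevance plus learned click behavior."""

    def __init__(self, click_weight: float = 0.5):
        self.click_weight = click_weight
        self.clicks = defaultdict(int)  # product -> clicks observed so far

    def record_click(self, product: str) -> None:
        self.clicks[product] += 1

    def rank(self, keyword_scores: dict[str, float]) -> list[str]:
        # Blend the static keyword score with a bonus per observed click,
        # so repeated interactions gradually reshape the ordering.
        return sorted(
            keyword_scores,
            key=lambda p: keyword_scores[p] + self.click_weight * self.clicks[p],
            reverse=True,
        )

ranker = LearningRanker()
scores = {"red sneakers": 1.0, "running shoes": 0.9, "sandals": 0.8}
print(ranker.rank(scores)[0])   # keywords alone favor "red sneakers"
ranker.record_click("running shoes")
print(ranker.rank(scores)[0])   # one click lifts "running shoes" to the top
```

Real systems weigh many more behavioral signals than clicks, but the design choice is the same: the ordering is a function of both query relevance and accumulated evidence of intent.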

This is really what delivering personalized journeys is all about: the site understands the customer, knows what they want and how they want it. For instance, when a shopper is very clear about what they want, the AI can plot the quickest route through the site to the payment page, while customers looking for inspiration can be given a slower and more immersive experience, with lots of hand-holding as required, such as links to online chat to help them with their decision or curated content to inspire browsing.

AI in ecommerce assumes a character all of its own, essentially a digital assistant that is trusted by the customer to help them find what they want. Retailers can personalize AI in any way they choose, while the processing and intelligence that sits behind it continues to work unseen.

AI in action of course creates a huge amount of interactional and behavioral data that the retailer can use to make improvements over time to base search, navigation, merchandising, display, promotions and checkout experience. It delivers good results for individual customers as well as all customers as their online behavior continues to evolve.

Our view is that customers want help when they are on a website. They want to be able to ask questions using natural rather than search language and they want the search function to learn based on those answers. By ensuring that their search strategy is underpinned by AI, retailers can then introduce more dynamic search enablers, such as visual and voice. But rather than simply adding commands, the customer is able to hold conversations with the digital assistant using natural language. Search then turns into discovery and it is this that leads to higher customer conversions, repeat visits and long-term loyalty.

To date, a lot of the conversation around AI has focused on the technology rather than what it enables in the real world. And there has been some reticence to adopt it for fear that it will replace human jobs; however, in the case of online search, one automated process is simply complementing another and all in all, doing a much better job. Check out your own search function now. How is that working for you?

Jamie Challis is UK Director, Findologic


Rage Against the Algorithm: the Risks of Overestimating Military Artificial Intelligence – Chatham House

AI holds the potential to replace humans for tactical tasks in military operations, beyond current applications such as navigation assistance. For example, in the US, the Defense Advanced Research Projects Agency (DARPA) recently held the final round of its AlphaDogfight Trials, where an algorithm controlling a simulated F-16 fighter was pitted against an Air Force pilot in virtual aerial combat. The algorithm won 5-0. So what does this mean for the future of military operations?

The agency's deputy director remarked that these tools are now ready for weapons systems designers "to be in the toolbox." At first glance, the dogfight suggests that AI-enabled air combat would provide tremendous military advantages, including the absence of the survival instincts inherent to humans, the ability to consistently operate under acceleration stress beyond the limits of the human body, and high targeting precision.

The outcome of these trials, however, does not mean that this technology is ready for deployment on the battlefield. In fact, an array of considerations must be taken into account prior to deployment and use, namely the ability to adapt in real-life combat situations, physical limitations, and legal compliance.

First, as with all technologies, the performance of an algorithm in its testing environment is bound to differ from real-life applications, as in the case of cluster munitions. For instance, Google Health developed an algorithm to help with diabetic retinopathy screening. While the algorithm's accuracy rate in the lab was over 90 per cent, it did not perform well outside the lab: because the algorithm had been trained on high-quality scans, it rejected more than a fifth of the real-life scans, which were deemed below the required quality threshold. As a result, the process ended up being as time-consuming and costly as traditional screening, if not more so.

Similarly, virtual environments akin to the AlphaDogfight Trials do not reflect the full extent of the risks, hazards and unpredictability of real-life combat. In the dogfight exercise, for example, the algorithm had full situational awareness and was repeatedly trained on the rules, parameters and limitations of its operating environment. But in a real-life, dynamic battlefield the list of variables is long and will inevitably fluctuate: visibility may be poor, extreme weather could affect operations and the performance of aircraft, and the behaviour and actions of adversaries will be unpredictable.

Every single eventuality would need to be programmed in line with the commander's intent in an ever-changing situation, or it would drastically affect the performance of the algorithms, including in target identification and firing precision.

Another consideration relates to the limitations of the hardware that AI systems depend on. Algorithms rely on hardware to operate equipment such as sensors and computer systems, each of which is constrained by physical limitations. These can be targeted by an adversary, for example through electronic interference, to disrupt the functioning of the computer systems on which the algorithms run.

Hardware may also be affected involuntarily. For instance, a pilotless aircraft controlled by an algorithm can indeed undergo higher accelerations, and thus higher g-force, than the human body can endure. However, the aircraft itself is also subject to physical limitations, such as acceleration limits beyond which parts of the aircraft, like its sensors, may be severely damaged, which in turn affects the algorithm's performance and, ultimately, mission success. It is critical that these physical limitations are factored into the equation when deploying these machines, especially when they rely so heavily on sensors.

Another major, and perhaps the greatest, consideration relates to the ability to rely on machines for legal compliance. The DARPA dogfight focused exclusively on the algorithm's ability to successfully control the aircraft and counter the adversary; nothing, however, indicates its ability to ensure that strikes remain within the boundaries of the law.

In an armed conflict, the deployment and use of such systems on the battlefield are not exempt from international humanitarian law (IHL), most notably its customary principles of distinction, proportionality and precautions in attack. Such a system would need to differentiate between civilians, combatants and military objectives; calculate whether its attacks will be proportionate to the set military objective and provide live collateral damage estimates; and take the necessary precautions to ensure the attacks remain within the boundaries of the law, including the ability to abort if necessary. This would also require the machine to stay within the rules of engagement for that particular operation.

It is therefore critical to incorporate IHL considerations from conception and throughout the development and testing phases of algorithms to ensure the machines are sufficiently reliable for legal compliance purposes.

It is also important that developers address the 'black box' issue, whereby the algorithm's calculations are so complex that it is impossible for humans to understand how it arrived at its results. Addressing the algorithm's opacity is necessary not only to improve its performance over time, but also for accountability and investigation purposes in cases of incidents and suspected violations of applicable laws.

Algorithms are becoming increasingly powerful and there is no doubt that they will confer tremendous advantages on the military. Over-hype, however, must be avoided, both for the machines' technical reliability and for legal compliance purposes.

The testing and experimentation phases are key, as it is during these that developers can fine-tune the algorithms. Developers must therefore be held accountable for ensuring the reliability of machines by incorporating considerations pertaining to performance and accuracy, hardware limitations, and legal compliance. This could help prevent real-life incidents that result from overestimating the capabilities of AI in military operations.


Banks arent as stupid as enterprise AI and fintech entrepreneurs think – TechCrunch

Announcements like Selina Finance's $53 million raise, and another $64.7 million raise the next day for a different banking startup, spark enterprise artificial intelligence and fintech evangelists to rejoin the debate over how banks are stupid and need help or competition.

The complaint is that banks are seemingly too slow to adopt fintechs' bright ideas. They don't seem to grasp where the industry is headed. Some technologists, tired of marketing their wares to banks, have instead decided to go ahead and launch their own challenger banks.

But old-school financiers aren't dumb. Most know the buy-versus-build choice in fintech is a false one. The right question is almost never whether to buy software or build it internally. Instead, banks have often worked to walk the difficult but smarter path right down the middle, and that's accelerating.

That's not to say banks haven't made horrendous mistakes. Critics complain about banks spending billions trying to be software companies, creating huge IT organizations with huge redundancies in cost and longevity challenges, and investing in ineffectual innovation and intrapreneurial endeavors. But overall, banks know their business far better than the entrepreneurial markets that seek to influence them.

First, banks have something most technologists don't have enough of: domain expertise. Technologists tend to discount the exchange value of domain knowledge, and that's a mistake. Without critical discussion, deep product-management alignment, and crisp, clear business usefulness, too much technology becomes abstracted from the material value it seeks to create.

Second, banks are not reluctant to buy because they don't value enterprise artificial intelligence and other fintech. They're reluctant because they value it too much. They know enterprise AI gives a competitive edge, so why should they get it from the same platform everyone else is attached to, drawing from the same data lake?

Competitiveness, differentiation, alpha, risk transparency and operational productivity will be defined by how highly productive, high-performance cognitive tools are deployed at scale in the very near future. The combination of NLP, ML, AI and cloud will accelerate competitive ideation by an order of magnitude. The question is, how do you own the key elements of competitiveness? It's a tough question for many enterprises to answer.

If they get it right, banks can obtain the true value of their domain expertise and develop a differentiated edge where they don't just float along with every other bank on someone's platform. They can define the future of their industry and keep the value. AI is a force multiplier for business knowledge and creativity. If you don't know your business well, you're wasting your money. The same goes for the entrepreneur: if you can't make your portfolio absolutely business-relevant, you end up being a consulting business pretending to be a product innovator.

So are banks at best cautious, and at worst afraid? They don't want to invest in the next big thing only to have it flop. They can't distinguish what's real from hype in the fintech space. And that's understandable. After all, they have spent a fortune on AI. Or have they?

It seems they have spent a fortune on stuff called AI: internal projects with not a snowball's chance in hell of scaling to the volume and concurrency demands of the firm. Or they have become enmeshed in huge consulting projects staggering toward some lofty objective that everyone knows deep down is not possible.

This perceived trepidation may or may not be good for banking, but it certainly has helped foster the new industry of the challenger bank.

Challenger banks are widely accepted to have come about because traditional banks are too stuck in the past to adopt their new ideas. Investors too easily agree. In recent weeks, American challenger bank Chime unveiled a credit card, U.S.-based Point launched, and German challenger bank Vivid launched with the help of Solarisbank, a fintech company.

Traditional banks are spending resources on hiring data scientists too, sometimes in numbers that dwarf the challenger banks'. Legacy bankers want to listen to their data scientists on questions and challenges rather than pay more for an external fintech vendor to answer or solve them.

This arguably is the smart play. Traditional bankers are asking themselves why they should pay for fintech services they can't 100% own, or how they can buy the right bits and retain the parts that amount to a competitive edge. They don't want that competitive edge floating around in a data lake somewhere.

From banks' perspective, it's better to do fintech internally, or else there's no competitive advantage; the business case is always compelling. The problem is that a bank is not designed to stimulate creativity in design. JPMC's COIN project is a rare and fantastically successful example, but it reflects a super-alignment between creative fintech and a bank able to articulate a clear, crisp business problem (a Product Requirements Document, for want of a better term). Most internal development is playing games with open source, with the shine of the alchemy wearing off as budgets are looked at hard with respect to return on investment.

A lot of people are going to talk about setting new standards in the coming years as banks onboard these services and buy new companies. Ultimately, fintech firms and banks are going to join together and make the new standard as new options in banking proliferate.

So, there's a danger in spending too much time learning how to do it yourself and missing the boat as everyone else moves ahead.

Engineers will tell you that untutored management can fail to steer a consistent course. The result is an accumulation of technical debt as development-level requirements keep zigzagging. Laying too much pressure on your data scientists and engineers can also lead to technical debt piling up faster. A bug or an inefficiency is left in place. New features are built as workarounds.

This is one reason why in-house-built software has a reputation for not scaling. The same problem shows up in consultant-developed software. Old problems in the system hide underneath new ones and the cracks begin to show in the new applications built on top of low-quality code.

So how do we fix this? What's the right model?

It's a bit of a dull answer, but success comes from humility. It requires an understanding that big problems are solved by creative teams, each understanding what they bring, each respected as equals, and managed with a completely clear articulation of what needs to be solved and what success looks like.

Throw in some Stalinist project management and your probability of success goes up an order of magnitude. So, the successes of the future will see banks having fewer but far more trusted fintech partners that jointly value the intellectual property they are creating. They'll have to respect that neither can succeed without the other. It's a tough code to crack. But without it, banks are in trouble, and so are the entrepreneurs who seek to work with them.
