Bitcoin (BTC) Could Gain Even More This Year Than in 2019, Says Fundstrat’s Tom Lee – U.Today

Fundstrat's Tom Lee reiterated his earlier prediction about Bitcoin (BTC) posting bigger gains in 2020 than in 2019 during his most recent appearance on Yahoo! Finance.

Lee claimed that the combination of "elevated" geopolitical tensions, the upcoming reward halving, and institutional money would form a perfect setup for Bitcoin's next rally in 2020.

As reported by U.Today, Fundstrat concluded that there was a strong probability that the price of Bitcoin could surge by more than 100 percent in 2020, meaning the famed Wall Street analyst expects BTC to close the year at $14,000 or higher.

Back in July 2019, Lee claimed that BTC could touch up to $40,000 by Q4 2019, which proved an extremely inaccurate prediction. However, given the aforementioned tailwinds, the permabull might finally be spot-on this year.

For those who do not want to directly buy Bitcoin but still want exposure to the volatile asset class, Lee recommends taking a look at companies like Square. The Jack Dorsey-helmed payments giant introduced Bitcoin deposits for its Cash App in June 2018.

Lee also mentioned Barry Silbert's Grayscale Bitcoin Trust (GBTC), which can be accessed by many US investors. Grayscale's coffers increased by $600 mln in 2019, with 71 percent of this sum attributed to institutional investors.

The rest is here:
Bitcoin (BTC) Could Gain Even More This Year Than in 2019, Says Fundstrat's Tom Lee - U.Today

4 Bitcoin Mixers for the Privacy Conscious – Bitcoin News

In an era of unprecedented global surveillance, it is unreasonable to expect the blockchain world to be any different. It is perfectly reasonable, though, to resist this surveillance through countermeasures that thwart the would-be surveillers. Digital privacy is a right that everyone is entitled to. Thanks to the provision of bitcoin mixers, you can claim that entitlement by shuffling your coins and emerging with untainted crypto whose origins have been obfuscated.

Also read: How Dropgangs and Dead Drops Are Transforming Darknet Practices

Just as using Tor doesn't give you internet anonymity, bitcoin mixing alone doesn't grant you automatic privacy. It helps, but only when used in conjunction with other privacy-enhancing techniques, like not using exchanges that enforce KYC, and not recombining your freshly mixed UTXOs, thereby undoing all your hard work. News.Bitcoin.com will examine ways to enhance your anonymity when using mixing services in the near future. For now, just know that the following mixing services are not a silver bullet for privacy. When used as their developers recommend, however, they can significantly enhance the fungibility of your coins.

Bitcoin Mixer does exactly what its name suggests, but it also does a lot more. In addition to mixing up your BTC, the service can do the same with LTC and ETH, providing privacy for three of the most popular cryptos. It's a custodial service, which generally means you can mix larger amounts of coins than with a noncustodial service, where you're reliant on your peers to provide privacy in numbers. Using the service is simple: enter the address you'd like your shuffled coins to be sent to, and drag the slider to select your desired mixing time, ranging from 30 minutes to 20 hours. The longer you're prepared to wait, the greater the degree of anonymity you can expect. The platform charges a fee of 2-5%.
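
As a rough illustration of what that fee range means for the amount you get back (the helper function here is ours, not the service's published logic):

```python
# Illustrative only: estimate what arrives after a custodial mixer's fee.
# The 2-5% range comes from the article; the helper itself is hypothetical.
def estimated_payout(amount_btc: float, fee_pct: float) -> float:
    assert 2.0 <= fee_pct <= 5.0, "Bitcoin Mixer charges 2-5% per the article"
    return amount_btc * (1 - fee_pct / 100)

print(estimated_payout(0.8, 2.0))  # best case: 0.784 BTC back
print(estimated_payout(0.8, 5.0))  # worst case: 0.76 BTC back
```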

One for the BCH brigade, Cashshuffle provides noncustodial mixing of bitcoin cash. It's fully decentralized, and operates by mixing the UTXOs in your BCH wallet with those of other Cashshuffle users. Over $40 million of BCH has been mixed through Cashshuffle, which is compatible with wallets such as Electron Cash. If you're new to the world of bitcoin cash mixing, news.Bitcoin.com has published a detailed guide to using the service. There are also plans for a Tor-integrated version of the service, known as Cashfusion, which will further diminish the ability of blockchain forensics firms to profile BCH users.

Most noncustodial BTC and BCH mixers are based on implementations of Coinjoin, a trustless method for combining bitcoin payments from multiple users into a single transaction, masking their origin. Cashshuffle is based on Coinjoin, and so are the two most popular wallet-integrated BTC mixers, Whirlpool and Wasabi. The former is developed by Samourai Wallet, and enables users of the noncustodial wallet to mix their UTXOs with others by selecting one of three pools of varying sizes: 0.01, 0.05, and 0.5 BTC. If you have 1 BTC to mix, for example, select the 0.5 BTC pool and your UTXOs will be sent through in two cycles, until all of your coins have been cleaned. The Whirlpool fee remains the same whether you're mixing one coin or 10, making Samourai's Whirlpool Coinjoin implementation cost-effective. It's also fast.
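
A minimal sketch of that pool arithmetic, assuming the simple ceiling division the 1 BTC example implies (the helper is hypothetical, not Samourai code):

```python
import math

# Hypothetical helper mirroring the paragraph's arithmetic: Whirlpool offers
# three pool denominations, and an amount larger than the chosen pool is
# mixed over multiple cycles.
POOLS_BTC = (0.01, 0.05, 0.5)

def cycles_needed(amount_btc: float, pool_btc: float) -> int:
    assert pool_btc in POOLS_BTC, "Whirlpool pool sizes per the article"
    return math.ceil(amount_btc / pool_btc)

print(cycles_needed(1.0, 0.5))  # 2 cycles, matching the article's example
```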

Samourai and Wasabi are engaged in a dispute over whose mixing service provides greater anonymity. Samourai appears to have the upper hand at present, but that doesn't mean you should write off Wasabi: it's an excellent noncustodial BTC wallet for the privacy-conscious, and its integrated Chaumian Coinjoin mixing service is continually improving. The Plustoken scammers famously tried to wash thousands of BTC through Wasabi and failed due to the size of their transactions, which dwarfed those of all other users combined. For regular users seeking to mix modest amounts of BTC, greater anonymity and less scrutiny should be assured, making Wasabi perfect for everyday use.

Whether you're planning to use a custodial or noncustodial bitcoin mixer, do your homework, read some reviews, and familiarize yourself with its workings. Then, after successfully mixing your first set of UTXOs, make a habit of repeating the exercise with new coins that come into your possession. Think of it as cleaning your digital house. In the process, you'll also be enhancing the anonymity of your fellow coinjoiners.

What bitcoin mixers do you recommend? Let us know in the comments section below.

Disclaimer: This article is for informational purposes only. It is not an offer or solicitation of an offer to buy or sell, or a recommendation, endorsement, or sponsorship of any products, services, or companies. Bitcoin.com does not provide investment, tax, legal, or accounting advice. Neither the company nor the author is responsible, directly or indirectly, for any damage or loss caused or alleged to be caused by or in connection with the use of or reliance on any content, goods or services mentioned in this article.

Images courtesy of Shutterstock.

Kai's been manipulating words for a living since 2009 and bought his first bitcoin at $12. It's long gone. He's previously written whitepapers for blockchain startups and is especially interested in P2P exchanges and DNMs.

Follow this link:
4 Bitcoin Mixers for the Privacy Conscious - Bitcoin News

Essential AI & Machine Learning Certification Training Bundle Is Available For A Limited Time 93% Discount Offer Avail Now – Wccftech

Machine learning and AI are the future of technology. If you wish to become part of the world of technology, this is the place to begin. The world is becoming more dependent on technology every day, and it wouldn't hurt to embrace it; resist it, and you will simply become obsolete. Wccftech is offering an amazing discount on the Essential AI & Machine Learning Certification Training Bundle. The offer will expire in less than a week, so avail yourself of it right away.

The bundle includes 4 extensive courses on NLP, computer vision, data visualization and machine learning. Each course will help you understand the technology world a bit more, and you will not regret investing your time and money in this. The courses have been created by experts, so you are in safe hands. Here are highlights of what the Essential AI & Machine Learning Certification Training Bundle has in store for you:

The bundle has been brought to you by GreyCampus, known for providing learning solutions to professionals in various fields, including project management, data science, big data, quality management and more. They offer different kinds of teaching platforms, including e-learning and live-online. All these courses have been specifically designed to meet the market's changing needs.

Original Price Essential AI & Machine Learning Certification Training Bundle: $656
Wccftech Discount Price Essential AI & Machine Learning Certification Training Bundle: $39.99

Go here to read the rest:
Essential AI & Machine Learning Certification Training Bundle Is Available For A Limited Time 93% Discount Offer Avail Now - Wccftech

How machine learning and automation can modernize the network edge – SiliconANGLE

If you want to know the future of networking, follow the money right to the edge.

Applications are expected to move from data centers to edge facilities in record numbers, opening up a huge new market opportunity. The edge computing market is expected to grow at a compound annual growth rate of 36.3 percent between now and 2022, fueled by rapid adoption of the internet of things, autonomous vehicles, high-speed trading, content streaming and multiplayer games.

What these applications have in common is a need for near zero-latency data transfer, usually defined as less than five milliseconds, although even that figure is far too high for many emerging technologies.

The specific factors driving the need for low latency vary. In IoT applications, sensors and other devices capture enormous quantities of data, the value of which degrades by the millisecond. Autonomous vehicles require information in real time to navigate effectively and avoid collisions. The best way to support such latency-sensitive applications is to move applications and data as close as possible to the data ingestion point, thereby reducing the overall round-trip time. Financial transactions now occur at sub-millisecond cycle times, leading one brokerage firm to invest more than $100 million to overhaul its stock trading platform in a quest for faster and faster trades.

As edge computing grows, so do the operational challenges for telecommunications service providers such as Verizon Communications Inc., AT&T Corp. and T-Mobile USA Inc. For one thing, moving to the edge essentially disaggregates the traditional data center. Instead of massive numbers of servers located in a few centralized data centers, the provider edge infrastructure consists of thousands of small sites, most with just a handful of servers. All of those sites require support to ensure peak performance, which strains the resources of the typical information technology group to the breaking point and sometimes beyond.

Another complicating factor is network functions moving toward cloud-native applications deployed on virtualized, shared and elastic infrastructure, a trend that has been accelerating in recent years. In a virtualized environment, each physical server hosts dozens of virtual machines and/or containers that are constantly being created and destroyed at rates far faster than humans can effectively manage. Orchestration tools automatically manage the dynamic virtual environment in normal operation, but when it comes to troubleshooting, humans are still in the driver's seat.

And it's a hot seat to be in. Poor performance and service disruptions hurt the service provider's business, so the organization puts enormous pressure on the IT staff to resolve problems quickly and effectively. The information needed to identify root causes is usually there. In fact, navigating the sheer volume of telemetry data from hardware and software components is one of the challenges facing network operators today.

A data-rich, highly dynamic, dispersed infrastructure is the perfect environment for artificial intelligence, specifically machine learning. The great strength of machine learning is the ability to find meaningful patterns in massive amounts of data that far outstrip the capabilities of network operators. Machine learning-based tools can self-learn from experience, adapt to new information and perform humanlike analyses with superhuman speed and accuracy.

To realize the full power of machine learning, insights must be translated into action, a significant challenge in the dynamic, disaggregated world of edge computing. That's where automation comes in.

Using the information gained by machine learning and real-time monitoring, automated tools can provision, instantiate and configure physical and virtual network functions far faster and more accurately than a human operator. The combination of machine learning and automation saves considerable staff time, which can be redirected to more strategic initiatives that create additional operational efficiencies and speed release cycles, ultimately driving additional revenue.

Until recently, the software development process for a typical telco consisted of a lengthy sequence of discrete stages that moved from department to department and took months or even years to complete. Cloud-native development has largely made obsolete this so-called waterfall methodology in favor of a high-velocity, integrated approach based on leading-edge technologies such as microservices, containers, agile development, continuous integration/continuous deployment and DevOps. As a result, telecom providers roll out services at unheard-of velocities, often multiple releases per week.

The move to the edge poses challenges for scaling cloud-native applications. When the environment consists of a few centralized data centers, human operators can manually determine the optimum configuration needed to ensure the proper performance for the virtual network functions, or VNFs, that make up the application.

However, as the environment disaggregates into thousands of small sites, each with slightly different operational characteristics, machine learning is required. Unsupervised learning algorithms can run all the individual components through a pre-production cycle to evaluate how they will behave in a production site. Operations staff can use this approach to develop a high level of confidence that the VNF being tested is going to come up in the desired operational state at the edge.

AI and automation can also add significant value in troubleshooting within cloud-native environments. Take the case of a service provider running 10 instances of a voice call processing application as a cloud-native application at an edge location. A remote operator notices that one VNF is performing significantly below the other nine.

The first question is: do we really have a problem? Some variation in performance between application instances is not unusual, so answering the question requires determining the normal range of VNF performance values in actual operation. A human operator could take readings of a large number of instances of the VNF over a specified time period and then calculate the acceptable key performance indicator (KPI) values, a time-consuming and error-prone process that must be repeated frequently to account for software upgrades, component replacements, traffic pattern variations and other parameters that affect performance.

In contrast, AI can determine KPIs in a fraction of the time and adjust the KPI values as needed when parameters change, all with no outside intervention. Once AI determines the KPI values, automation takes over. An automated tool can continuously monitor performance, compare the actual value to the AI-determined KPI and identify underperforming VNFs.
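A minimal sketch of that baseline-and-flag loop, with a three-sigma band standing in for whatever model a real system would learn (the names, numbers and threshold are all illustrative):

```python
import statistics

# Sketch: derive a KPI baseline from historical per-VNF readings, then flag
# instances that fall outside the learned normal range (3-sigma assumed here).
def learn_baseline(samples: list[float]) -> tuple[float, float]:
    mu = statistics.mean(samples)
    sigma = statistics.stdev(samples)
    return mu - 3 * sigma, mu + 3 * sigma  # acceptable KPI band

def flag_underperformers(current: dict[str, float],
                         band: tuple[float, float]) -> list[str]:
    low, high = band
    return [vnf for vnf, kpi in current.items() if not (low <= kpi <= high)]

history = [980, 1010, 995, 1005, 990, 1000, 1015, 985]  # calls/sec, made up
band = learn_baseline(history)
now = {f"vnf-{i}": 1000.0 for i in range(1, 10)} | {"vnf-10": 600.0}
print(flag_underperformers(now, band))  # ['vnf-10'], the laggard instance
```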

That information can then be forwarded to the orchestrator for remedial action such as spinning up a new VNF or moving the VNF to a new physical server. The combination of AI and automation helps ensure compliance with service-level agreements and removes the need for human intervention a welcome change for operators weary of late-night troubleshooting sessions.

As service providers accelerate their adoption of edge-oriented architectures, IT groups must find new ways to optimize network operations, troubleshoot underperforming VNFs and ensure SLA compliance at scale. Artificial intelligence technologies such as machine learning, combined with automation, can help them do that.

In particular, there have been a number of advancements over the last few years to enable this AI-driven future. They include systems and devices to provide high-fidelity, high-frequency telemetry that can be analyzed, highly scalable message buses such as Kafka and Redis that can capture and process that telemetry, and compute capacity and AI frameworks such as TensorFlow and PyTorch to create models from the raw telemetry streams. Taken together, they can determine in real time if operations of production systems are in conformance with standards and find problems when there are disruptions in operations.
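Sketching the shape of such a pipeline, with the kafka-python client on the consuming side and a trivial threshold standing in for a trained TensorFlow/PyTorch model (topic, broker and field names are all invented):

```python
import json
from kafka import KafkaConsumer  # kafka-python client

# Sketch of the telemetry path the paragraph describes. The topic name,
# broker address and threshold "model" are placeholders; in practice the
# score would come from a trained TensorFlow/PyTorch model.
consumer = KafkaConsumer(
    "vnf-telemetry",                      # hypothetical topic
    bootstrap_servers="localhost:9092",
    value_deserializer=lambda b: json.loads(b.decode("utf-8")),
)

def is_anomalous(reading: dict) -> bool:
    # Stand-in for model inference: flag readings with high latency.
    return reading.get("latency_ms", 0.0) > 5.0

# Runs indefinitely, scoring each telemetry message as it arrives.
for msg in consumer:
    if is_anomalous(msg.value):
        print(f"anomaly on {msg.value.get('vnf_id')}: {msg.value}")
```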

All that has the potential to streamline operations and give service providers a competitive edge at the edge.

Sumeet Singh is vice president of engineering at Juniper Networks Inc., which provides telcos AI and automation capabilities to streamline network operations and helps them use automation capabilities to take advantage of business potential at the edge. He wrote this piece for SiliconANGLE.

See more here:
How machine learning and automation can modernize the network edge - SiliconANGLE

Predicting Healthcare Utilization With Applied Machine Learning – AJMC.com Managed Markets Network

On this episode of Managed Care Cast, we speak with John Showalter, MD, chief product officer at Jvion and an internal medicine physician, and Soy Chen, MS, director of data science at Jvion. We discuss their research about using applied machine learning to predict healthcare utilization based on social determinants of health, appearing in the January 2019 Health IT issue of The American Journal of Managed Care.

They found that the social determinant of health most associated with risk was air quality. In addition, neighborhood in-migration, transportation, and purchasing channel preferences were more telling than ethnicity or gender in determining patients' use of resources.

On this episode of Managed Care Cast, we speak to study authors Soy Chen, MS, and John Showalter, MD, about how they sourced data for the algorithm, the technology's impact on the future of healthcare, and privacy concerns raised by artificial intelligence.

Listen above or through one of these podcast services:

iTunes

TuneIn

Stitcher

Spotify

Continued here:
Predicting Healthcare Utilization With Applied Machine Learning - AJMC.com Managed Markets Network

The open source licence debate: what we need to know – Open Source Insider – ComputerWeekly.com

As we have already noted on Computer Weekly Open Source Insider, open source grew, it proliferated and it became something that many previously proprietary-only software vendors embraced as a key means of development.

But the issue of how open source software is licensed is still the stuff of some debate.

Open Source Insider has already looked at the issues relating to dead projects (that are still walking and running) and the need for workable incentivisation models.

GitHub chief operating officer (COO) Erica Brescia noted that, from her perspective, she is seeing an increasing tension between open source projects and those that are building services on top of open source, such as cloud vendors with their database services.

Brescia notes that licenses applied to open source projects a decade ago did not consider the possibility of a cloud vendor delivering a software-as-a-service (SaaS) layer using the project without contributing back to it, which is leaving some open source companies in a difficult position.

Computer Weekly's Cliff Saran wrote "With friends like AWS, who needs an open source business?" and noted that a New York Times article suggested that Amazon Web Services (AWS) was strip-mining open source projects by providing managed services based on open source code, without contributing back to the community.

We have also looked at the security aspects of open source licensing.

For his money, Rado Nikolov, executive VP at software intelligence company Cast, believes the open source licensing debate also has a security element to it.

"Large organisations using open source code from GitHub, xs:code and other sources range from Walmart to NASA, collectively holding billions of pieces of sensitive data. Although open source code packages can be obtained at low or no cost, their various intellectual property and usage stipulations may lead to expensive legal implications if misunderstood or ignored," said Nikolov.

Ilkka Turunen, global director of solutions architecture at DevSecOps automation company Sonatype, further reminded us that there are 1001 ways of commercialising open source software, but when releasing open source, the developer has a choice of publishing it under a license that is essentially a contract between them and the end user.

So there's security, there's fair and just contribution back to the community, there's layering over open for commercial use, there's the complexity of just so many open source licences to choose from, and there's even the concern over whether trade sanctions can affect open source projects and see them becoming bifurcated along national borders.

Open source is supposed to be built around systems of meritocracy and be for the benefit of all. We must work hard to ensure that we can shoulder the nuances of licensing and keep open source software as good as it should be. Let the debate continue.

More here:
The open source licence debate: what we need to know - Open Source Insider - ComputerWeekly.com

Kitware Offers Latest Innovations in Healthcare Simulation with Updates to Interactive Medical Simulation Toolkit and Pulse Physiology Engine -…

Clifton Park, NY, Jan. 17, 2020 (GLOBE NEWSWIRE) -- Kitware, a leader in open source software research and development, has released the latest versions of two of its popular medical training and simulation toolkits: the Interactive Medical Simulation Toolkit (iMSTK) 2.0 and the Pulse Physiology Engine (Pulse) 2.3. Updates to these toolkits include improved models and functionality based on feedback from user and developer communities. Kitware will showcase these latest features and improvements at the International Meeting on Simulation in Healthcare (IMSH) in San Diego, January 18-22 at booth 912.

Both iMSTK and Pulse provide the technology to build virtual simulators that can help practicing surgeons, medical students, residents, and nurses to rehearse or plan medical procedures. For example, iMSTK has been used to help medical professionals prepare for biopsies, resectioning, radiosurgery, and laparoscopy without compromising patient safety in the operating room. It can also help accredit potential surgeons in basic skills for laparoscopy, endoscopy or robotic surgery. Pulse provides necessary physiologic feedback for clinicians training to provide life-saving medical treatment, such as for hemorrhage, tension pneumothorax, airway trauma, ventilator use and settings, and anaphylaxis.

"Kitware's medical computing team is dedicated to advancing research solutions in the medical community," said Andinet Enquobahrie, the director of medical computing at Kitware. "Whether we are collaborating with a university on research, working with our communities to improve our software platforms, or partnering with another company to integrate our software into their products and projects, our goal is to provide application developers the tools they need to develop powerful applications for medical skill training."

iMSTK 2.0 Improves Features, Efficiency of Physics, Collision and Rendering Modules

iMSTK is a free, open source toolkit that offers product developers and researchers all the software components they need to build and test virtual simulators for medical training and planning. Release 2.0 offers improved functionality with many new features as well as refactored modules that address the ease of use and extendability of the API. Specifically, it greatly improves the features and efficiency of the physics, collision and rendering modules.

Here are some release highlights:

Pulse 2.3 Improves Models and Functionality to Advance the Engine for Customer Needs

Pulse is a free, open source physiology engine that is used to rapidly prototype virtual simulation applications. These applications simulate whole-body human physiology through adult computational physiology models. Release 2.3 includes updates that were the result of Kitware's work with users to improve models and functionality of the engine.

Here are some release highlights:

For more information about iMSTK, visit the iMSTK website. For more information about Pulse, visit the newly redesigned Pulse website or sign up for the Pulse newsletter. To receive the latest updates on all of Kitware's software platforms, subscribe to our blog.

About Kitware

Since 1998, Kitware has been providing software research and development services to customers ranging from startups to Fortune 500 companies, including government and academic laboratories worldwide. Kitware's core areas of expertise are computer vision, data and analytics, high-performance computing and visualization, medical computing, and software process. The company has grown to more than 150 employees, with offices in Clifton Park, NY; Arlington, VA; Carrboro, NC; Santa Fe, NM; and Lyon, France. For more information visit kitware.com.

Read this article:
Kitware Offers Latest Innovations in Healthcare Simulation with Updates to Interactive Medical Simulation Toolkit and Pulse Physiology Engine -...

MongoDB: Riding The Data Wave – Seeking Alpha

MongoDB (MDB) is a database software company which is benefiting from the growth in unstructured data and leading the growth in non-relational databases. Despite MongoDB's recent rise in share price, its current valuation is modest given its strong position in a large and attractive market.

There has been an explosion in the growth of data in recent years, with this growth dominated by unstructured data. Unstructured data is currently growing at a rate of 26.8% annually, compared to structured data, which is growing at a rate of 19.6% annually.

Figure 1: Growth in Data

(source: m-files)

Unstructured data refers to any data which, despite possibly having internal structure, is not structured via pre-defined data models or schemas. Unstructured data includes formats like audio, video and social media postings and is often stored in non-relational (NoSQL) databases. Structured data is suitable for storage in a traditional database (rows and columns) and is normally stored in relational databases.
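
To make the contrast concrete (the field names here are invented for illustration), the same customer might look like this in each world:

```python
# Illustrative contrast: a rigid relational row versus a flexible document.
# Fixed columns: (id, name, signup_date)
structured_row = ("C-1001", "Ada Lovelace", "1998-12-10")

# A document can carry nested, irregular data with no pre-defined schema.
unstructured_doc = {
    "customer_id": "C-1001",
    "name": "Ada Lovelace",
    "social_posts": [{"text": "Loving the new app!", "likes": 42}],
    "support_audio": "s3://bucket/call-7781.wav",  # reference to media
}
```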

Mature analytics tools exist for structured data, but analytics tools for mining unstructured data are nascent. Improved data analytics tools for unstructured data will help to increase the value of this data and encourage companies to ensure they are collecting and storing as much of it as possible. Unstructured data analytics tools are designed to analyze information that doesn't have a pre-defined model and include tools like natural language processing.

Table 1: Structured Data Versus Unstructured Data

(source: Adapted by author from igneous)

Unstructured data is typically stored in NoSQL databases, which can take a variety of forms, including key-value stores, document databases, wide-column stores and graph databases.

Unstructured data can also be stored in multimodel databases which incorporate multiple database structures in the one package.

Figure 2: Multimodel Database

(source: Created by author)

Some of the potential advantages of NoSQL databases include flexible data schemas, horizontal scalability and lower cost on commodity hardware.

Common use-cases for NoSQL databases include web-scale, IoT, mobile applications, DevOps, social networking, shopping carts and recommendation engines.

Relational databases have historically dominated the database market, but they were not built to handle the volume, variety and velocity of data being generated today, nor were they built to take advantage of the commodity storage and processing power available today. Common applications of relational databases include ERP, CRM and ecommerce. Relational databases are tabular, highly dependent on pre-defined data definitions and usually scale vertically (a single server has to host the entire database to ensure acceptable performance). As a result, relational databases can be expensive, difficult to scale and dependent on a small number of failure points. The solution to support rapidly growing applications is to scale horizontally, by adding servers instead of concentrating more capacity in a single server. Organizations are now turning to scale-out architectures using open software technologies, commodity servers and cloud computing instead of large monolithic servers and storage infrastructure.

Figure 3: Data Structure and Database Type

(source: Created by author)

According to IDC, the worldwide database software market, which it refers to as structured data management software, was $44.6 billion in 2016 and is expected to grow to $61.3 billion in 2020, representing an 8% compound annual growth rate. Despite the rapid growth in unstructured data and the increasing importance of non-relational databases, IDC forecasts that relational databases will still account for 80% of the total operational database market in 2022.
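Those figures are internally consistent; a one-line check (our arithmetic, not IDC's):

```python
# IDC as quoted: $44.6B (2016) growing to $61.3B (2020) over four years.
cagr = (61.3 / 44.6) ** (1 / 4) - 1
print(f"{cagr:.1%}")  # ~8.3%, consistent with the ~8% CAGR cited
```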

Database management system (DBMS) cloud services made up 23.3% of the DBMS market in 2018, excluding DBMS licenses hosted in the cloud. In 2017, cloud DBMS services accounted for 68% of DBMS market growth, with Amazon Web Services (AMZN) and Microsoft (MSFT) accounting for 75% of that growth.

MongoDB provides document databases using open source software and is one of the leading providers of NoSQL databases addressing the requirements of unstructured data. MongoDB's software was downloaded 30 million times between 2009 and 2017, with 10 million downloads in 2017 alone. It is frequently used for mobile apps, content management, real-time analytics and applications involving the Internet of Things, but it can be a good choice for any application where there is no clear schema definition.

Figure 4: MongoDB downloads

(source: MongoDB)

MongoDB has a number of offerings, including:

Figure 5: MongoDB Platform

(source: MongoDB)

Functionality of the software includes:

MongoDB's platform offers high performance, horizontal scalability, flexible data schema and reliability through advanced security features and fault-tolerance. These features are helping to attract users of relational databases with approximately 30% of MongoDB's new business in 2017 resulting from the migration of applications from relational databases.
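
A minimal pymongo sketch of that flexible schema in action (connection details and documents are placeholders): documents of different shapes share one collection and stay queryable.

```python
from pymongo import MongoClient  # assumes a local mongod for illustration

client = MongoClient("mongodb://localhost:27017")
products = client.shop.products

# Documents in one collection need not share a schema.
products.insert_one({"sku": "A-1", "name": "Lamp", "price": 25.0})
products.insert_one({"sku": "B-2", "name": "Phone", "price": 400.0,
                     "specs": {"storage_gb": 128, "colors": ["black", "white"]}})

# Queries can reach into nested fields that only some documents have.
for doc in products.find({"specs.storage_gb": {"$gte": 64}}):
    print(doc["name"])  # prints "Phone"
```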

MongoDB generates revenue through term licenses and hosted as-a-service solutions. Most contracts are one year in length, invoiced upfront, with revenue recognized ratably over the term of the contract, although a growing number of customers are entering multiyear subscriptions. Revenue from hosted as-a-service solutions is primarily generated on a usage basis and is billed either in arrears or paid up front. Services revenue is comprised of consulting and training services, which generally result in losses and are primarily used to drive customer retention and expansion.

MongoDB's open source business model has allowed the company to scale rapidly, and it now has over 16,800 customers, including half of the Global Fortune 100 as of 2017. The open source business model uses the community version as a pipeline for potential future subscribers and relies on customers converting to a paid model once they require premium support and tools.

Figure 6: Prominent MongoDB Customers

(source: Created by author using data from MongoDB)

MongoDB's growth is driven largely by its ability to expand revenue from existing customers. This is shown by the expansion of Annual Recurring Revenue (ARR) over time, where ARR is defined as the subscription revenue contractually expected from customers over the following 12 months, assuming no increases or reductions in their subscriptions. ARR excludes MongoDB Atlas, professional services and other self-service products. The fiscal year 2013 cohort increased their initial ARR from $5.3 million to $22.1 million in fiscal year 2017, representing a multiple of 4.1x.

Figure 7: MongoDB Cohort ARR

(source: MongoDB)

Although MongoDB continues to incur significant operating losses, the contribution margin of new customers quickly becomes positive, indicating that as MongoDB's growth rate slows, the company will become profitable. Contribution margin is defined as the ARR of subscription commitments from the customer cohort at the end of a period, less the associated cost of subscription revenue and estimated allocated sales and marketing expense.

Figure 8: MongoDB 2015 Cohort Contribution Margin

(source: MongoDB)

MongoDB continues to achieve rapid revenue growth, driven by an increasing number of customers and increased revenue per customer. Revenue growth has shown little sign of decline, which is not surprising given the size of MongoDB's market opportunity. Revenue per customer is modest, and MongoDB still has significant potential to expand the number of Global Fortune 100 customers.

Figure 9: MongoDB Revenue

(source: Created by author using data from MongoDB)

Figure 10: MongoDB Customer Numbers

(source: Created by author using data from MongoDB)

MongoDB's revenue growth has been higher than that of other listed database vendors since 2017, as a result of its expanding customer base and growing revenue per customer. The rise of cloud computing and non-relational databases has had a large impact on relational database vendors, with DBMS growth now dominated by cloud computing vendors and non-relational database vendors.

Figure 11: Database Vendor Revenue

(source: Created by author using data from company reports)

MongoDB's revenue growth is relatively high for its size when compared to other database vendors, but is likely to begin to decline in coming years.

Figure 12: Database Vendor Revenue Growth

(source: Created by author using data from company reports)

MongoDB's revenue is dominated by subscription revenue and this percentage has been increasing over time. This relatively stable source of income holds MongoDB in good stead for the future, particularly if customers can be converted to longer-term contracts.

Figure 13: MongoDB Subscription Revenue

(source: Created by author using data from MongoDB)

MongoDB generates reasonable gross profit margins for an enterprise software company from its subscription services, although these have begun to decline in recent periods, likely as a result of the introduction of the entry-level Atlas offering in 2016 and possibly also increasing competition.

Figure 14: MongoDB Gross Profit Margin

(source: Created by author using data from MongoDB)

MongoDB has exhibited a large amount of operating leverage in the past and is now approaching positive operating profitability. This is largely the result of declining sales and marketing and research and development costs relative to revenue. This trend is likely to continue as MongoDB expands, particularly as growth begins to decline and the burden of attracting new customers eases.

Figure 15: MongoDB Operating Profit Margin

(source: Created by author using data from MongoDB)

Figure 16: MongoDB Operating Expenses

(source: Created by author using data from MongoDB)

Although MongoDB's operating profitability is still negative, it is in line with other database vendors and should become positive within the next few years. This is supported by the positive contribution margin of MongoDB's customers after their first year.

Figure 17: Database Vendor Operating Profit Margins

(source: Created by author using data from company reports)

MongoDB is yet to achieve consistently positive free cash flow, although it appears to be on track as the business scales. This should be expected given the high-margin nature of the business and its low capital requirements. Current negative free cash flow is largely a result of expenditures in support of future growth, in the form of sales and marketing and research and development.

Figure 18: MongoDB Free Cash Flow

(source: Created by author using data from MongoDB)

Competitors in the database vendor market can be broken into incumbents, cloud platforms and challengers. Incumbents are the current dominant players in the market, like Oracle (ORCL), who offer relational databases. Cloud platforms are cloud computing vendors like Amazon and Microsoft that also offer database software and services. Challengers are pure play database vendors who offer a range of non-relational database software and services.

Table 2: Database Vendors

(source: Created by author)

Incumbents

Incumbents offer proven technology with a large set of features, which may be important for mission-critical transactional applications. This gives incumbents a strong position, particularly as relational databases are expected to retain the lion's share of the database market in coming years. Incumbent players that lack a strong infrastructure-as-a-service platform, though, are poorly positioned to capture new applications and are likely to be losers in the long run. This trend is evidenced by Teradata's (TDC) struggles since the advent of cloud computing and non-relational databases.

Cloud Platforms

Cloud service providers are able to offer a suite of SaaS solutions in addition to cloud computing, creating a compelling value proposition for customers. In exchange for reducing the number of vendors required and gaining access to applications designed to run together, database customers run the risk of being locked into a cloud vendor and paying significantly more for services which could potentially be inferior.

Challengers

Dedicated database vendors can offer best in breed technology, low costs and multi-cloud portability which helps to prevent cloud vendor lock-in.

The DBMS market is typically broken into operational and analytical segments. The operational DBMS market refers to databases that are tied to a live application, whereas the analytical market refers to the processing and analysis of data imported from various sources.

Figure 19: Database Market Competitive Landscape

(source: Created by author)

Gartner assesses MongoDB as a challenger in the operational database systems market due primarily to a lack of completeness of vision. The leaders are generally large companies which offer a broader range of database types in addition to cloud computing services. MongoDB's ability to succeed against these companies will be dependent on them being able to offer best in class services and/or lower cost services.

View original post here:
MongoDB: Riding The Data Wave - Seeking Alpha

Qualys offers GPS guidance for developers at the application security crossroads – ComputerWeekly.com

All developers care deeply about application [development] security.

Okay, that's perhaps not always strictly true, so let's try again.

All developers care deeply about application functionality and speed, which they then carry through to a secondary level of concern related to Ops-level application manageability, flexibility and security.

How, then, should we engage with programmers on aspects of security, especially as it now straddles something of a crossroads brought about by the move to increasingly cloud-native, cloud-first application development?

Security specialist Qualys (pronounced KWAL-IS) has attempted to address the application development security subject head-on by hosting what probably ranks as the first tech event of 2020.

Qualys Security Conference London 2020 ran this week in London with the tagline "application security at a crossroads", and isn't it just?

The company billed the event as an opportunity to explore the profound impact of digital transformation on the security industry and what it means for practitioners, partners and vendors.

Qualys is clearly focused on gaining attention from CIOs, CSOs and CTOs; but at ground level, the company says it works with network managers, cloud developers and security developers or, as they are known these days, DevSecOps practitioners.

So, for developers, then: as we have noted before on the Computer Weekly Developer Network, the Qualys Web Application Scanning (WAS) 6.0 product now supports Swagger version 2.0, allowing programmers to streamline [security] assessments of REST APIs and gain visibility of the security posture of mobile application backends and Internet of Things (IoT) services.

NOTE: Swagger is an open source software framework backed by a considerable ecosystem of tools that helps developers design, build, document and consume RESTful web services.
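
For a sense of what such a spec looks like, here is a minimal, invented Swagger 2.0 definition of a single REST endpoint, expressed as a Python dict; scanners of this kind consume specs like it to enumerate an API's surface:

```python
import json

# Minimal, invented Swagger 2.0 spec for one REST endpoint, of the general
# shape a scanner such as WAS 6.0 ingests to drive its assessments.
spec = {
    "swagger": "2.0",
    "info": {"title": "Device API", "version": "1.0"},
    "basePath": "/v1",
    "paths": {
        "/devices/{id}": {
            "get": {
                "parameters": [{"name": "id", "in": "path",
                                "required": True, "type": "string"}],
                "responses": {"200": {"description": "Device record"}},
            }
        }
    },
}
print(json.dumps(spec, indent=2))
```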

Qualys president and chief product officer Sumedh Thakar used his London keynote slot to deliver a piece he called The Evolution of the Qualys Platform: Unveiling the Latest Updates and Next-Gen Initiatives.

Speaking at the London show this January, Thakar suggested that the process of digital transformation has moved from being a prototyping, exploratory part of the business to, now in 2020, being something that IT development teams are truly rolling out.

"Banks are now looking at technologies that would allow users to open an account simply by taking a selfie," said Thakar, "and so this will mean that these processes (which essentially run on applications) need to run on a secure backbone. The infrastructure that organisations will run on has become super-hybrid in order to be able to join all these new digital services together."

Cloud, containerisation and refactoring applications to be mobile friendly are just some of the major changes that need to happen in digitally disruptive environments.

Thakar is perhaps suggesting that if we can show developers that there are automated intelligence layers in place that will work across hybrid infrastructures and reduce the Mean Time To Remediation (MTTR), then developers might in fact take more interest in the security aspect of the systems they are working to engineer in the first place.

Thakar used a number of real world examples (from bank accounts that can be opened with nothing more than a selfie to intelligent motion-sensing doorbells) in an attempt to justify and validate the need for Qualys security technologies. With all examples tabled, Thakar led the audience forward to think about how system responses should be actioned.

He explained that the evolution of the Qualys platform has come about because SIEM, SOAR and log file analytics solutions (such as Splunk) were either never built to support a [security] data model that could be driven by Machine Learning (ML), or were not actually designed for security in the first place. Log file analytics, moreover, acts on historical data, so it is very much after the event.

NOTE: Security Information & Event Management (SIEM) tools were always designed as log correlation specialists, while Security Orchestration, Automation & Response (SOAR) was again too much of a point solution (though Qualys is adding it directly as a playbook function anyway).

As programmers design and evolve an image in the cloud, these developers will only need to make a single API call to bring Qualys security layers to bear on their cloud-native applications, due to the company's proximity to both Microsoft Azure and Google Cloud Platform.

New (in terms of products) in 2020 is Qualys Respond, which includes an agent to deploy patches automatically to users' devices, so, again, this allows applications to feature remediation controls more intuitively.

Other developer tools from the company include the ability to use Qualys Browser Recorder, a free Google Chrome browser extension, to review scripts for navigating through complex authentication and business workflows in web applications.

So then, will developers ever truly embrace security issues and allow DevSecOps to put the Ops into operationalised security?

Qualys would like to think so. Engagement at the coal face, along with an ability to explain how complex authentication, optimised security agents and streamlined security assessments and audits can be made easy (dare we suggest almost joyful), will, very arguably, ultimately make a real difference for developers.

See the original post here:
Qualys offers GPS guidance for developers at the application security crossroads - ComputerWeekly.com

Don’t want a robot stealing your job? Take a course on AI and machine learning. – Mashable

Just to let you know, if you buy something featured here, Mashable might earn an affiliate commission.

There are some 288 lessons included in this online training course.

Image: pexels

By StackCommerce, Mashable Shopping, 2020-01-16 19:44:17 UTC

TL;DR: Jump into the world of AI with the Essential AI and Machine Learning Certification Training Bundle for $39.99, a 93% savings.

From facial recognition to self-driving vehicles, machine learning is taking over modern life as we know it. It may not be the flying cars and world-dominating robots we envisioned 2020 would hold, but it's still pretty futuristic and frightening. The good news is if you're one of the pros making these smart systems and machines, you're in good shape. And you can get your foot in the door by learning the basics with this Essential AI and Machine Learning Certification Training Bundle.

This training bundle provides four comprehensive courses introducing you to the world of artificial intelligence and machine learning. And right now, you can get the entire thing for just $39.99.

These courses cover natural language processing, computer vision, data visualization, and artificial intelligence basics, and will ultimately teach you to build machines that learn as they're fed human input. Through hands-on case studies, practice modules, and real-time projects, you'll delve into the world of intelligent systems and machines and get ahead of the robot revolution.

Here's what you can expect from each course:

Access 72 lectures and six hours of content exploring topics like convolutional neural networks (CNNs), recurrent neural networks (RNNs), and other deep architectures using TensorFlow. Ultimately, you'll build a foundation in both artificial intelligence, which is the concept in which machines develop the ability to simulate natural intelligence to carry out tasks, and machine learning, which is an application of AI aiming to learn from data and build on it to maximize performance.
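
To give a flavor of what that material builds toward (this is our own minimal example, not course code), a tiny image classifier in TensorFlow's Keras API looks like this:

```python
import tensorflow as tf

# Minimal CNN of the kind such a course builds up to; illustrative only.
model = tf.keras.Sequential([
    tf.keras.layers.Conv2D(16, 3, activation="relu", input_shape=(28, 28, 1)),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(10, activation="softmax"),  # e.g. 10 digit classes
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.summary()
```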

Through seven hours of content, you'll learn how to arrange critical data in a visual format: think graphs, charts, and pictograms. You'll also learn to deploy data visualization through Python using Matplotlib, a library that helps in viewing the data. Finally, you'll tackle actual geographical plotting using the Matplotlib extension called Basemap.
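
For instance, a chart of the kind the course covers takes only a few lines of Matplotlib (the data here is made up):

```python
import matplotlib.pyplot as plt

# Minimal bar chart; the data is invented for illustration.
months = ["Jan", "Feb", "Mar", "Apr"]
signups = [120, 150, 90, 200]

plt.bar(months, signups)
plt.title("Monthly signups")
plt.ylabel("Count")
plt.show()
```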

In just 5.5 hours, this course gives you a more in-depth look at CNNs, transfer learning, object localization, object detection, and using TensorFlow. You'll also learn the challenges of working with real-world data and how to tackle them head-on.

Natural language processing (NLP) is a field of AI which allows machines to interpret and comprehend human language. Through 5.5 hours of content, you'll understand the processes involved in this field and learn how to build artificial intelligence for automation. The course itself provides an innovative methodology and sample exercises to help you dive deep into NLP.
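
As a taste of the field (again, our own illustration rather than course material), a bag-of-words representation, one of the first steps in most NLP pipelines, can be built with scikit-learn:

```python
from sklearn.feature_extraction.text import CountVectorizer

# Minimal bag-of-words model: turn raw text into word-count vectors.
docs = ["the robot reads text", "the human reads the robot"]
vectorizer = CountVectorizer()
matrix = vectorizer.fit_transform(docs)

print(vectorizer.get_feature_names_out())  # learned vocabulary
print(matrix.toarray())                    # per-document word counts
```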

Originally $656, the Essential AI and Machine Learning Bundle is 93% off right now, getting you a year's worth of access for just $39.99.

Prices subject to change.

Read the original post:
Don't want a robot stealing your job? Take a course on AI and machine learning. - Mashable