By Using Automation, How Is Artificial Intelligence Benefiting The Fintech Industry In 2022? – Inventiva


Right now, the fintech business is undergoing a considerable transformation. Customers are benefiting from the disruption through easier access to credit, which has made payments and transactions simpler than ever before. All of this is feasible because of technological advancements such as open banking and the rise of AI and machine learning.

Young India is credit-hungry, and its per capita expenditure has been steadily increasing. Until recently, customers had to go to a bank branch, physically produce the appropriate paperwork, and wait at least 15 days for credit or a loan. Banks used to take a long time to process documents, conduct KYC through in-person visits, assess credit risk, and finally authorize loans. Banks and lenders, on the other hand, may now lend money in a matter of hours rather than days. This has made the entire loan cycle shorter and more accessible to the average person.

The fintech sector has undergone a complete transformation because of digitalization, open APIs, and machine learning integration. Lenders can now process loan applications, conduct e-KYC, assess creditworthiness, and disburse loan amounts in only a few minutes. This has opened up a lot of options for people looking for financing. Every month, millions of customers apply for a loan, but only 10 to 15% of them successfully complete the application procedure, and only 2 to 5% of those who apply are approved.

Both pre-processing and post-processing steps are affected by loan dropout. Filling out the application, receiving an offer, presenting KYC papers, and providing account statements, income tax returns, and so on are all pre-processing phases. Credit evaluation, credit determination, and loan disbursal are the post-processing steps. Several causes contribute to loan dropout at various stages: the client does not complete the application, is unable to supply the required papers, does not meet the risk score requirements, is price sensitive, and so on.

The loss of consumers along the loan application journey has grown costly for digital lending organizations as customer acquisition expenses have risen, resulting in a significant drop-off at every stage. This is where AI-driven intelligent automation technologies are assisting financial institutions in not only automating the entire process but also drastically lowering their costs, and even helping customers make informed decisions during their loan application journey.

Furthermore, it is a time-consuming procedure for lending organizations to complete all of this scrutiny while relying on the expertise of credit risk managers, credit policymakers, legal resources, and an entire team to analyze customer paperwork, and the process can still fail. Given the large number of applicants in this digital era, it's hard to examine all the papers, assess the risk, determine creditworthiness, and make the best judgments possible while minimizing risk.

To address this problem, AI and machine learning-based intelligent automation systems have been created and implemented to handle massive amounts of data, categorize anomalies, evaluate payment behaviour and patterns, assess creditworthiness, and automate risk decisions. AI is enabling credit risk managers to gain a scientific understanding of each customer's identity and risk behaviour, as well as to explain the drivers of credit risk. AI is also helping lenders predict a client's probability of dropping out of the loan journey, which aids in identifying qualified candidates, allowing the funnel to be optimized so that quality consumers are targeted and the entire application process is improved.

Following the completion of loan applications, the overarching AI model helps predict which customers are most likely to have their loan approved and establishes a pattern for suitable applications. This enables lenders to identify high-quality clients ahead of time and focus their efforts on them, boosting conversion rates and reducing loan default rates.
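As a rough illustration of the kind of model involved (a sketch only, not any particular lender's system), the snippet below trains a gradient-boosting classifier to score the probability that an applicant abandons the journey; the file name, feature columns, and label are hypothetical.

```python
# Minimal sketch of a loan-dropout scoring model; data, columns, and
# features are illustrative assumptions, not a production credit model.
import pandas as pd
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

df = pd.read_csv("applications.csv")  # hypothetical export of application events
features = ["requested_amount", "monthly_income", "credit_score",
            "docs_uploaded", "minutes_on_kyc_step"]
X, y = df[features], df["dropped_out"]  # 1 = abandoned before disbursal

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42)
model = GradientBoostingClassifier().fit(X_train, y_train)

# Rank in-flight applicants by predicted dropout probability so follow-up
# nudges or calls can be targeted at the riskiest ones first.
dropout_risk = model.predict_proba(X_test)[:, 1]
print("Holdout AUC:", roc_auc_score(y_test, dropout_risk))
```

The same ranking idea applies on the approval side: swap the label for "loan approved" and the model surfaces the applications most likely to convert.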

Client acquisition costs are also significantly reduced as a result. The lender can also employ AI-powered intelligent automation to predict which customers are most likely to abandon their digital loan application at critical phases such as accepting the offer, KYC, and document submission. The potent mix of AI and automation produces a one-of-a-kind customer service approach that also helps to prevent loan default. Using this data, the lender can optimize targeted consumer advertising and call centre activity.

Adopting digital technology such as artificial intelligence in loan application administration can help banks reorganize the customer journey, increase efficiency, and free up people to provide value-added services.

Edited by Prakriti Arora



See original here:
By Using Automation, How Is Artificial Intelligence Benefiting The Fintech Industry In 2022? - Inventiva

Where Does Legal Accountability Rest Between Tesla’s Artificial Intelligence and Human Error? – Above the Law

Self-driving cars are nifty. Electric vehicles are cool. And when you think of self-driving electric cars, it's hard not to think of Tesla. That said, not everyone associates them with safety. And with how the AI's algorithmic thinking is looking, they may have good reason.

On Thursday, the National Highway Traffic Safety Administration, an agency under the guidance of Transportation Secretary Pete Buttigieg, said it would be expanding a probe and look into 830,000 Tesla cars across all four current model lines, 11% more vehicles than they were previously examining.

Initially the probe started last year in response to Tesla vehicles mysteriously plowing into the scene of an existing accident where first responders were already present.

On Thursday, NHTSA said it had discovered that in 16 of these incidents, Autopilot aborted vehicle control less than one second prior to the first impact, suggesting the driver was not prepared to assume full control over the vehicle.

CEO Elon Musk has often claimed that accidents cannot be the fault of the company, as the data it extracted invariably showed Autopilot was not active at the moment of the collision.

At least 26 crashes and 11 deaths appear to involve Tesla's Autopilot feature. While it is true that drivers should have their hands at 10 and 2 with their eyes on the road, you've gotta admit that there have been some representations of the Autopilot feature as a replacement for human inputs. A last-minute shift from AI to UI is exactly the type of childish loopholing masquerading as brilliance you'd expect from a guy with an Elden Ring build this bad.

Look, I know I've made that gag in a prior article where I dunked on Musk for being goofy, BUT TWO MEDIUM SHIELDS?

For fear of being labeled a one-trick Tesla with weak windows: this is exactly what you'd expect from a guy who was already on trial for killing someone with a car.

What's next? A special re-issue of O.J. Simpson's If I Did It with an additional chapter from Elon on how he'd use tweets to manipulate stock prices?

Cartoonish evil gets satirical responses. In the meantime, it may be worth it to consider electric car alternatives that arent Teslas. And pay attention to the road, damn it.

Elon Musk's Regulatory Woes Mount As U.S. Moves Closer To Recalling Tesla's Self-Driving Software [Fortune]

Chris Williams became a social media manager and assistant editor for Above the Law in June 2021. Prior to joining the staff, he moonlighted as a minor Memelord in the Facebook group Law School Memes for Edgy T14s. He endured Missouri long enough to graduate from Washington University in St. Louis School of Law. He is a former boatbuilder who cannot swim, a published author on critical race theory, philosophy, and humor, and has a love for cycling that occasionally annoys his peers. You can reach him by email at cwilliams@abovethelaw.com and by tweet at @WritesForRent.

Here is the original post:
Where Does Legal Accountability Rest Between Tesla's Artificial Intelligence and Human Error? - Above the Law

Artificial intelligence tool predicts response to immunotherapy in lung and gynecologic cancer patients – EurekAlert

Image: Anant Madabhushi (Credit: CWRU)

CLEVELAND - Collaboration between pharmaceutical companies and the Center for Computational Imaging and Personalized Diagnostics (CCIPD) at Case Western Reserve University has led to the development of artificial intelligence (AI) tools to benefit patients with non-small cell lung cancer (NSCLC) based on an analysis of routine tissue biopsy images, according to new research.

This year, more than 236,000 adults in the United States will be diagnosed with lung cancer, about 82% of them with non-small cell lung cancer, according to the American Society of Clinical Oncology.

Researchers at the CCIPD used AI to identify biomarkers from biopsy images for patients with NSCLC, as well as gynecologic cancers, that help predict the response to immunotherapy and clinical outcomes, including survival.

"We have shown that the spatial interplay of features relating to the cancer nuclei and tumor-infiltrating lymphocytes drives a signal that allows us to identify which patients are going to respond to immunotherapy and which ones will not," said Anant Madabhushi, CCIPD director and Donnell Institute Professor of Biomedical Engineering at Case Western Reserve.
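To make the idea of spatial interplay concrete, here is a heavily simplified sketch (not the CCIPD pipeline itself) of one such feature: the distance from each cancer nucleus to its nearest tumor-infiltrating lymphocyte, computed from centroid coordinates that an upstream nucleus-segmentation step is assumed to have produced.

```python
# Simplified spatial-interplay feature: nearest-TIL distance per cancer
# nucleus. The centroids are random placeholders standing in for
# segmentation output; the 50-pixel radius is an arbitrary threshold.
import numpy as np
from scipy.spatial import cKDTree

rng = np.random.default_rng(0)
cancer_xy = rng.uniform(0, 1000, size=(200, 2))  # hypothetical cancer nucleus centroids (px)
til_xy = rng.uniform(0, 1000, size=(150, 2))     # hypothetical lymphocyte centroids (px)

tree = cKDTree(til_xy)
nearest_til_dist, _ = tree.query(cancer_xy, k=1)

# Slide-level summaries like these could feed a downstream classifier that
# predicts immunotherapy response.
print("mean nearest-TIL distance:", nearest_til_dist.mean())
print("share of cancer nuclei with a TIL within 50 px:",
      (nearest_til_dist < 50).mean())
```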

The study was published this month in the journal Science Advances.

Immunotherapy is expensive, and studies show that only 20-30% of patients respond to the treatment, according to the National Institutes of Health and other sources. "These findings validate that the AI technologies developed by the CCIPD can help clinicians determine how best to treat patients with NSCLC and gynecologic cancers, including cervical, endometrial and ovarian cancer," Madabhushi said.

The study, drawn from a retrospective analysis of data, also revealed new biomarker information regarding a protein called PD-L1 that helps prevent immune cells from attacking non-harmful cells in the body.

Patients with high PD-L1 often receive immunotherapy as part of their treatment for NSCLC, while patients with low PD-L1 are often not offered immunotherapy, or it's coupled with chemotherapy.

"Our work has identified a subset of patients with low PD-L1 who respond very well to immunotherapy and may not require immunotherapy plus chemotherapy," Madabhushi said. "This could potentially help these patients avoid the toxicity associated with chemotherapy while also having a favorable response to immunotherapy."

The multi-site, multi-institutional study examined three common immunotherapy drugs (called checkpoint inhibitor agents) that target PD-L1: atezolizumab, nivolumab and pembrolizumab. The AI tools consistently predicted the response and clinical outcomes for all three immunotherapies.

The study is part of broader research conducted at CCIPD to develop and apply novel AI and machine-learning approaches to diagnose and predict the therapy response for various diseases and cancers, including breast, prostate, head and neck, brain, colorectal, gynecologic and skin.

The study coincides with Case Western Reserve recently signing a license agreement with Picture Health to commercialize AI tools to benefit patients with NSCLC and other cancers.

###

Case Western Reserve University is one of the country's leading private research institutions. Located in Cleveland, we offer a unique combination of forward-thinking educational opportunities in an inspiring cultural setting. Our leading-edge faculty engage in teaching and research in a collaborative, hands-on environment. Our nationally recognized programs include arts and sciences, dental medicine, engineering, law, management, medicine, nursing and social work. About 5,800 undergraduate and 6,300 graduate students comprise our student body. Visit case.edu to see how Case Western Reserve thinks beyond the possible.

Spatial interplay patterns of cancer nuclei and tumor-infiltrating lymphocytes (TILs) predict clinical benefit for immune checkpoint inhibitors

1-Jun-2022

Disclaimer: AAAS and EurekAlert! are not responsible for the accuracy of news releases posted to EurekAlert! by contributing institutions or for the use of any information through the EurekAlert system.

Read the original here:
Artificial intelligence tool predicts response to immunotherapy in lung and gynecologic cancer patients - EurekAlert

Credentials for thousands of open source projects free for the taking, again! – Ars Technica


A service that helps open source developers write and test software is leaking thousands of authentication tokens and other security-sensitive secrets. Many of these leaks allow hackers to access the private accounts of developers on Github, Docker, AWS, and other code repositories, security experts said in a new report.

The tokens give anyone with access to them the ability to read or modify the code stored in repositories that distribute an untold number of ongoing software applications and code libraries. The ability to gain unauthorized access to such projects opens the possibility of supply chain attacks, in which threat actors tamper with an app before it's distributed to users. The attackers can leverage their ability to tamper with the app to target huge numbers of projects that rely on the app in production servers.

Despite this being a known security concern, the leaks have continued, researchers in the Nautilus team at the Aqua Security firm are reporting. Two batches of data the researchers accessed using the Travis CI programming interface yielded 4.28 million and 770 million logs, respectively, spanning 2013 through May 2022. After sampling a small percentage of the data, the researchers found what they believe are 73,000 tokens, secrets, and various credentials.

"These access keys and credentials are linked to popular cloud service providers, including GitHub, AWS, and Docker Hub," Aqua Security said. "Attackers can use this sensitive data to initiate massive cyberattacks and to move laterally in the cloud. Anyone who has ever used Travis CI is potentially exposed, so we recommend rotating your keys immediately."

Travis CI is a provider of an increasingly common practice known as continuous integration. Often abbreviated as CI, it automates the process of building and testing each code change that has been committed. For every change, the code is regularly built, tested, and merged into a shared repository. Given the level of access CI needs to work properly, the environments usually store access tokens and other secrets that provide privileged access to sensitive parts inside the cloud account.

The access tokens found by Aqua Security involved private accounts of a wide range of repositories, including Github, AWS, and Docker.


Examples of the exposed access tokens, and a chart breaking them down by type, were published in Aqua Security's report.

A representative for Code Climate, the service shown in the chart above, said the credentials found by Aqua Security don't provide hackers with unauthorized access. "These are Test coverage tokens, used to report test coverage to Code Climate's Quality product," the representative said. "Unlike the other tokens mentioned in this post, these tokens are not considered secret, and cannot be used to access any data."

Aqua Security researchers added:

We found thousands of GitHub OAuth tokens. It's safe to assume that at least 10-20% of them are live, especially those that were found in recent logs. We simulated in our cloud lab a lateral movement scenario, which is based on this initial access scenario:

1. Extraction of a GitHub OAuth token via exposed Travis CI logs.

2. Discovery of sensitive data (i.e., AWS access keys) in private code repositories using the exposed token.

3. Lateral movement attempts with the AWS access keys in AWS S3 bucket service.

4. Cloud storage object discovery via bucket enumeration.

5. Data exfiltration from the target's S3 to the attacker's S3.


Travis CI representatives didn't immediately respond to an email seeking comment for this post. Given the recurring nature of this exposure, developers should proactively rotate access tokens and other credentials periodically. They should also regularly scan their code artifacts to ensure they don't contain credentials. Aqua Security has additional advice in its post.
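As a minimal sketch of the kind of scan developers can run over build logs and code artifacts before publishing them, the script below flags a few well-known credential patterns; the regular expressions are illustrative rather than exhaustive, and dedicated secret scanners cover far more cases.

```python
# Rough secret scan over log or source files; patterns are illustrative
# examples, not a complete or authoritative rule set.
import re
import sys

PATTERNS = {
    "AWS access key ID": re.compile(r"\bAKIA[0-9A-Z]{16}\b"),
    "GitHub personal access token": re.compile(r"\bghp_[A-Za-z0-9]{36}\b"),
    "generic secret assignment": re.compile(
        r"(?i)\b(api[_-]?key|secret|token|password)\s*[:=]\s*\S+"),
}

def scan(path):
    with open(path, errors="ignore") as handle:
        for lineno, line in enumerate(handle, 1):
            for name, pattern in PATTERNS.items():
                if pattern.search(line):
                    print(f"{path}:{lineno}: possible {name}")

if __name__ == "__main__":
    for path in sys.argv[1:]:
        scan(path)
```

Run it over exported CI logs or a working tree and treat any hit as a prompt to rotate the credential, not as proof that it is live.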

Post updated to add comment from Code Climate.

Link:
Credentials for thousands of open source projects free for the taking, again! - Ars Technica

What are the Most Famous Programming Tools and Techniques? – Programming Insider


A programming tool, also known as a software development tool, is a program or application that programmers use to create, debug, maintain, and support other programs and applications. The term usually refers to a set of relatively simple programs that can be combined to complete a task, much as many hand tools can be used to repair a physical object. It's difficult to draw a sharp line between tools and applications. Simple databases (such as a file holding a list of significant values) are frequently used by developers as tools, whereas a full-fledged database is normally thought of as a separate application or piece of software. CASE (computer-assisted software engineering) tools have been in demand for a long time.

Successful CASE tools, however, have been difficult to come by. CASE tools, such as those built around UML, prioritized design and architecture support; IDEs, on the other hand, have been the most successful of these tools. One of the characteristics of a professional software engineer is the ability to use a number of tools effectively. A program is a sequence of instructions that tells the computer to perform a variety of tasks; often, the instruction it performs next depends on the outcome of a previous instruction. This section outlines the two major ways in which you'll provide these instructions, or commands as they're commonly known: one method employs an interpreter, while the other uses a compiler.

Software tools are also useful for working with physical hardware. The Arduino platform, for example, makes it easy to build many kinds of applications. If you want to control the speed and direction of a DC motor on a robotics car, you can implement that task with an Arduino, as in the hedged sketch below.
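The sketch below is one possible way to do this from Python rather than from Arduino C++ code, using the pyFirmata library to talk to a board that is assumed to be running the StandardFirmata firmware and wired to an H-bridge motor driver; the serial port and pin numbers are assumptions.

```python
# Hedged sketch: drive a DC motor's speed and direction through an Arduino
# running StandardFirmata plus an H-bridge driver. Port and pins are assumed.
import time
from pyfirmata import Arduino

board = Arduino("/dev/ttyACM0")      # assumed serial port
speed = board.get_pin("d:9:p")       # PWM pin -> H-bridge enable (speed)
in1 = board.get_pin("d:7:o")         # direction input A
in2 = board.get_pin("d:8:o")         # direction input B

# Forward at 60% duty cycle, then reverse at 40%.
in1.write(1); in2.write(0); speed.write(0.6)
time.sleep(2)
in1.write(0); in2.write(1); speed.write(0.4)
time.sleep(2)

speed.write(0)                       # stop the motor
board.exit()
```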

Best Programming tools:

The most famous and useful programming tools are:

Every day, software developers are confronted with a large amount of information to remember. New technologies, keyboard shortcuts, software requirements, and best practices are all things to be aware of. Many of us reach a limit on how much we can keep in our thoughts at some point. Evernote's free tier gives you an external brain, a place where you may store learnings, articles, information, and keyboard shortcuts or commands. It's always there when you need it because it's cloud-based.

Trello is a project management app that is both simple and free. It's an app that lets you make columns or swim lanes and arrange cards in them. These cards can represent jobs that need to be performed or labor that needs to be done.

GitHub created Atom, a relatively new code editor. It's open source and free, and it looks fantastic. It's also quite simple to use. Atom is a terrific tool for hacking at scripts or working on side projects, even if you use a more feature-rich IDE for your development at work. Atom's markdown preview mode is one feature that sets it apart from other code editors. When working on Readme files and other documentation, you can enter notes in markdown and get an inline preview.

Unity is a free, end-to-end game engine that makes it easier than ever to develop professional, cross-platform games. It's usual for software developers to dismiss game development as cool but too difficult, but with an infusion of high-quality tutorials and ongoing updates to Unity's tooling, the barrier to entry has never been lower. By dabbling in a totally different sort of programming, you'll obtain insights and ideas that will help you become a better programmer overall, and you'll probably have a lot of fun doing it.

Code Climate is a code analysis tool that rates your software based on test coverage, complexity, duplication, security, style, and other factors. It comes with a two-week trial period. Even if you're not willing to pay, Code Climate can provide you with a wealth of information on the code quality of your most recent personal project, or, if your team is on board, the product or service you're developing. You definitely have a sense for code smells as a software developer: things that could be better. When you have a lot of things wrong with your code, it might be difficult to know where to start.

See original here:
What are the Most Famous Programming Tools and Techniques? - Programming Insider

Snowflake is going big on one of the world’s most popular programming languages – TechRadar

Snowflake has announced plans to bring Python "to the forefront of its Data Cloud platform" with upgrades that extend support for the programming language.

At its annual user conference, Snowflake Summit, the database company announced an expansion of its Snowpark developer framework that will give users easy access to a bounty of open source Python packages and libraries.

Now moving from private beta to public preview, Snowpark for Python promises to "improve programmability for data scientists, data engineers and app developers", Snowflake says.

Snowflake first introduced Snowpark in preview back in January 2021, before pushing the service to general availability earlier this year. Broadly, the objective was to give developers a simple and efficient way to program data in their language of choice.

"Our goal was to eliminate inefficient data pipelines and optimize processes and tasks that companies may be using just to get everyone on the same (data) page, said the firm, at the time of the launch.

Ultimately, Snowpark enables teams with different skill sets to collaborate and work on the same data, process data faster and more easily, and make data security and governance a top priority.

When it first went live, the Snowpark sandbox offered support for Java and Scala only, but the latest update now brings another of the world's most popular programming languages into the fray.
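For a flavour of what Snowpark for Python code looks like, here is a minimal sketch against the publicly documented snowflake.snowpark API; the connection parameters and table name are placeholders, and details may differ between the preview and later releases.

```python
# Minimal Snowpark for Python sketch: build a query lazily and let Snowflake
# execute the filter and aggregation server-side. Credentials are placeholders.
from snowflake.snowpark import Session
from snowflake.snowpark.functions import col

connection_parameters = {
    "account": "<account_identifier>",
    "user": "<user>",
    "password": "<password>",
    "warehouse": "<warehouse>",
    "database": "<database>",
    "schema": "<schema>",
}

session = Session.builder.configs(connection_parameters).create()

orders = session.table("ORDERS")                 # hypothetical table
summary = (orders
           .filter(col("STATUS") == "SHIPPED")
           .group_by("REGION")
           .count())

print(summary.to_pandas())                       # pull the small result locally
session.close()
```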

To supplement the rollout of Snowpark for Python, Snowflake also lifted the lid on a series of related upgrades that are currently under development. These include a native integration with Streamlit and other facilities designed to support the development and deployment of machine learning products written in Python.

Separately, the firm announced a private preview for a new service that will allow customers to access data stored in on-premise servers from within the Snowflake ecosystem, affording organizations the benefits of the cloud-based platform without the hassle of data migration.

"We are investing in Python to make it easier for data scientists, data engineers and application developers to build even more in the Data Cloud, without governance trade-offs," said Christian Kleinerman, SVP Product at Snowflake.

"Our latest innovations extend the value of our customers' data-driven ecosystems, enabling them with more access to data and new watts to develop with it in Snowflake. [These capabilities] are changing the way teams experiment, iterate and collaborate with data to drive value."

Disclaimer: Our flights and accommodation for Snowflake Summit 2022 were funded by Snowflake, but the organization had no editorial control over the content of this article.

See more here:
Snowflake is going big on one of the world's most popular programming languages - TechRadar

Top 7 ‘Hot’ Programming Languages of 2022 – ITPro Today

What's the most important programming language to learn in 2022? That's an open question, but one way to answer it is to look at languages that are currently trending.

Some of them are well-established coding languages that have long been popular. Others are newer languages that are just now entering their heyday. Either way, they're languages worth familiarizing yourself with.

Related: Is PHP Dying? No, but It Has an Image Problem

Here's a roundup of what are arguably the trendiest programming languages in 2022.

1. Python: When talking about hot programming languages in 2022, the list must start with Python. Probably no language is having a better year than Python, which recently slid into first place to become the most popular language of all. You could argue that Python doesn't quite deserve that status, but the fact is that it enjoys it.

Related: COBOL Language Still in Demand as Application Modernization Efforts Take Hold

2. Go: Go (or Golang, as it's often called) has long been a "cool" programming language, partly because it traces its roots to Google (which is a hotbed of coolness, technologically speaking) and partly because it's fast to write, fast to compile, and fast to run.

3. OPA: Open Policy Agent, or OPA, isn't technically a programming language. It's a policy language that lets you define resources using code. That makes it a hot language, however, in a world increasingly obsessed with doing "everything as code."

4. Swift: If you develop anything for the world of Apple, whether on macOS, iOS, or any other *OS platform, Swift is a language you absolutely need to know today. It's also a relatively easy language to code in, by many accounts.

5. C: C, which turns 50 this year, may be old, but it remains as relevant as ever and is still a hot programming language in 2022. It's messy, it's fast, and it's essential for a wide variety of programming tasks.

6. Java: It's arguably hard to get excited about Java, a language that is tedious to code in and whose code is relatively slow. But the fact is that Java was the most popular programming language for years, and tons of stuff are still written in it. Whether you actually enjoy coding in Java or not, it remains an important language as of 2022.

7. JavaScript: JavaScript is not the same as Java, but they're similar in that tons of stuff are written in JavaScript, too. If you are creating web apps in particular, JavaScript is probably the most important language for you to learn today.


Read more:
Top 7 'Hot' Programming Languages of 2022 - ITPro Today

Mayor Bowser Breaks Ground on Modernization of Stead Park Recreation Center | mayormb – Executive Office of the Mayor

(Washington, DC) Today, Mayor Muriel Bowser and community members broke ground on the Stead Park Recreation Center in the Dupont Circle Historic District. The $15.4 million project will preserve the history of Stead Park while modernizing the grounds and expanding the facility to create more accessible, integrated spaces for exercise, play, and community engagement. The project will also deliver the first Net Zero Energy-Ready recreation space within the Department of Parks and Recreation (DPR) portfolio, project-managed by the Department of General Services (DGS).

"Families in DC love our parks and open spaces, and we love delivering spaces and facilities that meet the needs of our communities, which is what we will do right here at Stead Park," said Mayor Bowser. "These continued investments and improvements are why the District, for the past two years, has been recognized for having the best park system in the nation."

The project consists of a 1.5-acre park and an existing historic carriage house named for Mary Force Stead as the primary building entry. Upon completion in 2023, the project will offer additional indoor recreational spaces, improved playgrounds and outdoor gathering spaces, and improved lighting. The recreation center will also have a solar canopy that includes a high-performing renewable energy system to offset all or most of its annual energy consumption, operating with net zero energy consumption to save tax dollars.

"Starting in 2017, we have been engaged with the community on what the future of Stead Park will become," said DGS Director Keith A. Anderson. "I am pleased that we are breaking ground on a project that has recreational elements for everyone and that will save on energy costs for the District."

The Stead Park modernization will honor Mary Force Stead's wish that the space be maintained for the perpetual use of the children of Washington, as noted on the carriage house plaque honoring her memory. The Friends of Stead Park donated $500,000 to support this project.

"We at DPR are incredibly excited about the modernization of the existing carriage house at Stead and the construction of an addition to the recreation center that will help foster community engagement and provide quality recreational programming in this vibrant neighborhood of Dupont Circle," said DPR Director Delano Hunter.

Mayor Bowser's Fiscal Year 2023 Fair Shot Budget invests over $365 million over the next six years to improve parks and recreation facilities across the District. Additionally, the Mayor invested $13.5 million for Recreation for A.L.L., a new DPR initiative to expand recreation offerings and ensure all District residents, particularly young people, have access to high-quality recreational programming.

To learn more, visit https://dgs.dc.gov/page/stead-park-recreation-center-project.

Social Media:
Mayor Bowser Twitter: @MayorBowser
Mayor Bowser Instagram: @Mayor_Bowser
Mayor Bowser Facebook: facebook.com/MayorMurielBowser
Mayor Bowser YouTube: https://www.bit.ly/eomvideos

Continue reading here:
Mayor Bowser Breaks Ground on Modernization of Stead Park Recreation Center | mayormb - Executive Office of the Mayor

Iterative Introduces First Machine Learning Experiment Tracking Extension for Microsoft Visual Studio Code – Business Wire

SAN FRANCISCO--(BUSINESS WIRE)--Iterative, the MLOps company dedicated to streamlining the workflow of data scientists and machine learning (ML) engineers, today announced a free extension to Visual Studio Code (VS Code), a source-code editor made by Microsoft, for experiment tracking and machine learning model development.

VS Code is a coding editor that helps users to start coding quickly in any programming language. The DVC Extension for Visual Studio Code allows users of all technical backgrounds to create, compare, visualize, and reproduce machine learning experiments. Through Git and Iterative's DVC, the extension makes experiments easily reproducible, unlike traditional experiment tracking tools that just stream metrics.

This is an open source VS Code extension for machine learning practitioners looking to accelerate their model development experience, said Ivan Shcheklein, co-founder and CTO of Iterative. It simplifies data scientists' machine learning model development workflows and meets ML modelers where they work. This extension eliminates the need for costly SaaS solutions for experiment tracking, turning VS Code into a native ML experimentation tool, built for developers.

The extension complements the existing VS Code UX with features using the Command Palette, Source Control view, File Tree explorer, and even custom in-editor webviews, to aid data scientists in their model development and experimentation workflows. Users can pull and push versioned data, run and reproduce experiments, and view tables and metrics.

"Beyond the tracking of ML models, metrics, and hyperparameters, this extension also makes ML experiments reproducible by tracking source code and data changes," said Dmitry Petrov, CEO of Iterative. Iteratives experiment versioning technology that was implemented in DVC last year makes this reproducibility possible."

The VS Code extension offers data scientists the ability to view, run, and instantly reproduce experiments with parameters, metrics, and plots all in a single place, as well as manage and version data sets and models. The extension also provides resource tracking so that data scientists can see which data sets and models have changed and allows exploration of all files of a project or model. Other features include live tracking to see how metrics change in real-time, cloud-agnostic data versioning and management, and native plot visualization.

The VS Code extension helps organizations:

DVC, the underlying open-source technology behind the extension, brings agility, reproducibility, and collaboration into the existing data science workflow. It provides users with a Git-like interface for versioning data and models, bringing version control to machine learning and solving the challenges of reproducibility. DVC is built on top of Git and creates lightweight metafiles, which enable the data science and ML teams to efficiently handle large files that otherwise can't be stored.
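For a sense of how those versioned artifacts are reached from code, the sketch below uses the dvc.api Python module to read a DVC-tracked file at a specific Git revision; the repository URL, file path, and tag are hypothetical.

```python
# Read a DVC-tracked artifact at a given Git revision without cloning the
# whole repository; the repo, path, and rev below are hypothetical examples.
import dvc.api

REPO = "https://github.com/example/project"   # hypothetical repository
PATH = "data/train.csv"                       # hypothetical tracked file
REV = "v1.0"                                  # hypothetical Git tag

with dvc.api.open(PATH, repo=REPO, rev=REV) as f:
    print(f.readline().strip())               # e.g. the CSV header

# Resolve where the artifact actually lives in remote storage.
print(dvc.api.get_url(PATH, repo=REPO, rev=REV))
```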

To learn more about the VS Code extension, check out the blog and get started today.

About Iterative

Iterative.ai, the company behind Iterative Studio and popular open-source tools DVC, CML, and MLEM, enables data science teams to build models faster and collaborate better with data-centric machine learning tools. Iterative's developer-first approach to MLOps delivers model reproducibility, governance, and automation across the ML lifecycle, all integrated tightly with software development workflows. Iterative is a remote-first company, backed by True Ventures, Afore Capital, and 468 Capital. For more information, visit Iterative.ai.

Link:
Iterative Introduces First Machine Learning Experiment Tracking Extension for Microsoft Visual Studio Code - Business Wire

Particles: The Beginning of a New Age in Decentralization – PR Newswire

CINCINNATI and BERLIN, June 16, 2022 /PRNewswire/ -- Pre-seed round tech startup Anomaly Scienceannounced its patent-pending business proposal for "Particles" - a method for the management and fiscal sponsorship of open-source projects, Web3 initiatives, decentralized projects, and tokenization of intellectual property rights through the creation and utilization of smart contracts and blockchain technologies.

"Anomaly Science is headquartered in Cincinnati, Ohio," explained 19-year-old CEO Jacob Haap, "But I knew we needed to be ready to do business in multiple countries. That's why I moved to the European Union."

Haap's vision is to lower the barrier of entry to computer programming, and potentially other types of business, for people who have ideas but no corporate structure for support.

"(Jacob Haap) is going to make it so that everybody, anybody, can create any kind of breakthrough software with a variety of different programmers who are all going to work independently, and together, at the same time,"said American venture capitalist Tim Draper. "And it's going to be really awesome because all of those programmers (won't) have to deal with the SEC and all that other stuff because they are all under one corporate roof, but they can operate as entrepreneurs. They can have ownership in what they do. I think if you're a programmer, you should talk to Jacob."

Haap is a 2021 graduate of Draper's Hero Training program through Draper University, one of the top pre-accelerator programs in the world.

"I didn't know what to expect when I enrolled forDraper's Hero program," says Haap. This year's cohortfeatured over 80 entrepreneurs from an original applicant pool of over 2,000. The average age of the participants was 29. "As someone who just graduated from high school, I originally felt different from people who had been actively working in the business for years. But very quickly, I realized I was onto an idea with real potential."

During an event called Demo Day, over 80 entrepreneurs gathered in San Mateo, California, to give pitches to an audience of investors. A panel of judges ranked Haap's presentation third place, adding even more traction to his startup. Since then, Haap took his initial investment, continued to develop his business plan, and built proof-of-concept interfaces for linking crypto wallets with his hybrid source-code repository. "We are ready to take the next step with an investor who sees the amazing potential in the model we have built around Particles," explained Haap.

"Particles open a new door," continued Haap, "enabling greater collaboration and sharing of work, creating knowledge pools and hybrid-source codebases, and a means to do business in our growingly decentralized world that can function across international borders nearly seamlessly."

Anomaly Science is currently trying to start seed round funding with a goal of 500k. This is an opportunity for investors to become part of a transformational approach to doing business.

About Anomaly Science

Anomaly Science is a company building the bridge to Web3, making it easier for decentralized projects to take flight, lowering the level of entry to software development, and giving power back to software developers.

To learn more about Anomaly Science, you can reach us via email at [email protected] or visit our website anomsci.com to join the waitlist.

Media Contact: Jacob Haap, [email protected], +49 175 2954140

SOURCE Anomaly Science

See original here:
Particles: The Beginning of a New Age in Decentralization - PR Newswire