Expert: Don’t overlook security in rush to adopt AI – The Winchester Star

MIDDLETOWN – Lord Fairfax Community College hosted technologist Gary McGraw on Wednesday night. He spoke of the cutting-edge work being done at the Berryville Institute of Machine Learning, which he co-founded a year ago.

The talk was part of the college's Tech Bytes series of presentations by industry professionals connected to technology.

The Berryville Institute of Machine Learning is working to educate tech engineers and others about the risks they need to think about while building, adopting and designing machine learning systems. These systems involve computer programs called neural networks that learn to perform a task, such as facial recognition, by being trained on lots of data, such as pictures, McGraw said.
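For readers unfamiliar with the idea, the snippet below is a minimal sketch of that training process, written in Python with the scikit-learn library. It is an illustrative example, not code McGraw presented: a small neural network is fit to a standard dataset of handwritten-digit images and then asked to classify images it has not seen.

```python
# Illustrative sketch only: a small neural network "learns" to recognize
# handwritten digits by being trained on many labeled example images.
from sklearn.datasets import load_digits
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

X, y = load_digits(return_X_y=True)            # 8x8 pixel images, flattened to 64 numbers each
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

model = MLPClassifier(hidden_layer_sizes=(64,), max_iter=500, random_state=0)
model.fit(X_train, y_train)                    # the "learning": weights adjusted to fit the training data
print("accuracy on unseen images:", model.score(X_test, y_test))
```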

"It's important that we don't take security for granted or overlook security in the rush to adopt AI everywhere," McGraw said.

One easily relatable application of this technology is in smartphones, which use AI to analyze conversations, photos and web searches, all to process people's data, he said.

"There should be privacy by default. There is not. They are collecting your data; you are the product," he said.

The institute anticipates releasing a report within a week or two titled "An Architectural Risk Analysis of Machine Learning Systems," which identifies 78 risks in machine learning systems.

McGraw told the audience that, while the terms are not interchangeable, artificial intelligence and machine learning have been sold as magic technology that will miraculously solve problems. He said that is wrong. The raw data used in machine learning can be manipulated, which can open systems up to risks, such as attacks that could compromise information, even confidential information.

McGraw cited a few of those risks.

One risk is someone fooling a machine learning system by presenting malicious input that causes the system to make a false prediction or categorization. Another is that if an attacker can intentionally manipulate the data a machine learning system learns from, the entire system can be compromised.
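To make the second risk concrete, the sketch below (an illustration under assumed conditions, not an example from McGraw's talk or the report) poisons a quarter of a training set's labels and compares the result against a model trained on clean data. The dataset and model simply mirror the earlier sketch.

```python
# Illustrative sketch of data poisoning: an attacker who can rewrite a
# fraction of the training labels degrades what the model learns.
import numpy as np
from sklearn.datasets import load_digits
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

X, y = load_digits(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

rng = np.random.default_rng(0)
poisoned = y_train.copy()
idx = rng.choice(len(poisoned), size=len(poisoned) // 4, replace=False)
poisoned[idx] = rng.integers(0, 10, size=len(idx))   # attacker scrambles 25% of the labels

clean = MLPClassifier(hidden_layer_sizes=(64,), max_iter=500, random_state=0).fit(X_train, y_train)
dirty = MLPClassifier(hidden_layer_sizes=(64,), max_iter=500, random_state=0).fit(X_train, poisoned)
print("trained on clean data:   ", clean.score(X_test, y_test))
print("trained on poisoned data:", dirty.score(X_test, y_test))
```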

One of the most often discussed risks is data confidentiality. McGraw said data protection is already difficult enough without machine learning. Machine learning poses a unique challenge for protecting data because, through subtle means, information contained in the machine learning model itself could be extracted.
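A rough sketch of why even query-only access can leak information follows; it is an assumption for illustration, not anything McGraw demonstrated. An attacker who can only ask a model for predictions trains a lookalike "surrogate" model from the answers, approximating what the original model learned from private data.

```python
# Illustrative sketch of model extraction: the attacker never sees the
# private training data, only the victim model's predictions.
from sklearn.datasets import load_digits
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.tree import DecisionTreeClassifier

X, y = load_digits(return_X_y=True)
X_private, X_attacker, y_private, _ = train_test_split(X, y, test_size=0.5, random_state=0)

# The "victim" model is trained on data the attacker cannot access.
victim = MLPClassifier(hidden_layer_sizes=(64,), max_iter=500, random_state=0).fit(X_private, y_private)

# The attacker queries the victim and trains a surrogate on its answers.
X_query, X_holdout = X_attacker[:700], X_attacker[700:]
stolen_labels = victim.predict(X_query)
surrogate = DecisionTreeClassifier(random_state=0).fit(X_query, stolen_labels)

agreement = (surrogate.predict(X_holdout) == victim.predict(X_holdout)).mean()
print("surrogate matches the victim on unseen inputs:", agreement)
```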

LFCC student Myra Diaz, who is studying computer science at the college, attended the program.

"I like it. I am curious and so interested to see how we can get a computer to be judgmental in a positive way, such as judging what it is seeing," Diaz said.

Remaining speakers for this year's Tech Bytes programs are:

6 p.m. Feb. 19: Kay Connelly, Informatics.

1 p.m. March 11: Retired Secretary of the Navy Richard Danzig.

6 p.m. April 8: Heather Wilson, Analytics, L Brands.
