Computer security: ‘Melbourne Shuffle’ secures data in the cloud

To keep data safe in the cloud, a group of computer scientists suggests doing the Melbourne Shuffle.

That may sound like a dance move (and it is), but it's also a computer algorithm developed by researchers at Brown University.

The computing version of the Melbourne Shuffle aims to hide patterns that may emerge as users access data on cloud servers. Patterns of access could provide important information about a dataset -- information that users don't necessarily want others to know -- even if the data files themselves are encrypted.

"Encrypting data is an important security measure. However, privacy leaks can occur even when accessing encrypted data," said Olga Ohrimenko, lead author of a paper describing the algorithm. "The objective of our work is to provide a higher level of privacy guarantees, beyond what encryption alone can achieve."

The paper was presented this week at the International Colloquium on Automata, Languages, and Programming (ICALP 2014) in Copenhagen. Ohrimenko, who recently received her Ph.D. from Brown University and now works at Microsoft Research, co-authored the work with Roberto Tamassia and Eli Upfal, professors of computer science at Brown, and Michael Goodrich from the University of California-Irvine.

Cloud computing is increasing in popularity as more individuals use services like Google Drive and more companies outsource their data to providers like Amazon Web Services. As the amount of data on the cloud grows, so do concerns about keeping it secure. Most cloud service providers encrypt the data they store. Larger companies generally encrypt their own data before sending it to the cloud, not only to protect it from hackers but also to keep cloud providers themselves from snooping around in it.

But while encryption renders data files unreadable, it can't hide patterns of data access. Those patterns can be a serious security issue. For example, a service provider -- or someone eavesdropping on that provider -- might be able to figure out that after accessing files at certain locations on the cloud server, a company tends to come out with a negative earnings report the following week. Eavesdroppers may have no idea what's in those particular files, but they know that access to them is correlated with negative earnings.

But that's not the only potential security issue.

"The pattern of accessing data could give away some information about what kind of computation we're performing or what kind of program we're running on the data," said Tamassia, chair of the Department of Computer Science.

Some programs have very particular ways in which they access data. By observing those patterns, someone might be able to deduce, for example, that a company seems to be running a program that processes bankruptcy proceedings.
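A toy example makes the leak concrete. In the C sketch below, the "server" cannot read the (notionally encrypted) blocks, but it can log which indices are requested -- and a binary search betrays itself through its characteristic halving trace. The fetch function is an illustrative stand-in, not any real provider's interface.

    /*
     * Sketch: why access patterns leak even over encrypted data.
     * The server cannot decrypt the blocks, but it sees which
     * indices are requested, and different algorithms produce
     * recognizably different traces.
     */
    #include <stdio.h>

    #define N 16

    /* Stand-in for fetching an encrypted block; the observable
     * event is the index, logged as an eavesdropper would see it. */
    static int fetch(const int *blocks, int idx)
    {
        printf("server sees access to block %d\n", idx);
        return blocks[idx];
    }

    static int binary_search(const int *blocks, int key)
    {
        int lo = 0, hi = N - 1;
        while (lo <= hi) {
            int mid = lo + (hi - lo) / 2;
            int v = fetch(blocks, mid);
            if (v == key) return mid;
            if (v < key) lo = mid + 1; else hi = mid - 1;
        }
        return -1;
    }

    int main(void)
    {
        int blocks[N];
        for (int i = 0; i < N; i++) blocks[i] = i * 10;
        binary_search(blocks, 100);
        return 0;
    }

Searching for the value 100 produces the trace 7, 11, 9, 10 -- a halving pattern an observer could distinguish from, say, a linear scan without decrypting a single byte.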


MCUs, Memory Balance Security, Performance

TORONTO -- As the number of connected devices increases exponentially, so does the need for encryption. Thanks to the BYOD phenomenon, self-encrypting SSDs are finding their way into the enterprise to secure data regardless of operating system, while the Internet of Things is also driving the need for encryption, and in some cases it makes sense to do it at the microcontroller (MCU) level.

Adib Ghubril, research director at Gartner, said there are a number of benefits to encrypting data at the microcontroller level, including performance, power efficiency, and improved data protection. Since security is implemented at the hardware level, it's more difficult to hack, he said.

Of course, any application running a wireless interface benefits from encryption, Ghubril noted, including networked appliances such as smart meters and other intelligent IoT devices, and many of these devices are best served by MCUs that can encrypt their wireless payloads.

Microchip Technology recently expanded its line of eXtreme Low Power (XLP) PIC MCUs with the PIC24F GB2 family that includes an integrated hardware crypto engine, a random number generator and one-time-programmable key storage for protecting data in embedded applications.

Alexis Alcott, product marketing manager for Microchip's MCU16 division, said the GB2 devices include up to 128 KB of Flash and 8 KB of RAM in small 28- or 44-pin packages, and are targeted at battery-operated or portable applications such as IoT sensor nodes, access control systems, and door locks. She said one of the chief concerns of customers is securing devices and data without hurting battery life, and many IoT devices are part of larger systems sharing data through Bluetooth or Wi-Fi connectivity, which must be secure.

Wearables, including medical devices, are one of the fastest-growing IoT segments for Microchip, said Alcott, and securing sensitive medical information, particularly patients' data, is a chief concern. Another scenario she described was the use of sensors to monitor humidity levels in a museum, which would turn on periodically to gather data and send it to a central location. The device itself would not process the information, but the data would have to be encrypted both at rest and in transit, said Alcott. The recipient must then decrypt the data to read it.
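That wake-sample-encrypt-transmit-sleep cycle can be sketched roughly as follows. The hal_* functions here are hypothetical placeholders for a vendor's crypto, radio, and power drivers -- not Microchip's actual PIC24F API -- and a production design would use an authenticated mode such as AES-GCM. The point is the shape of the flow: the key never leaves the one-time-programmable storage, each message gets a fresh IV from the hardware RNG, and the radio only ever carries ciphertext.

    /*
     * Hedged sketch of a duty-cycled sensor node using an MCU's
     * hardware crypto engine. All hal_* functions are HYPOTHETICAL
     * placeholders, not any real vendor's API.
     */
    #include <stdint.h>
    #include <string.h>

    #define BLOCK_LEN 16

    /* Hypothetical hardware abstraction layer. */
    void hal_trng_read(uint8_t *buf, size_t len);            /* hardware RNG */
    void hal_aes_encrypt_otp_key(int key_slot,               /* key stays in */
                                 const uint8_t *iv,          /* OTP storage  */
                                 const uint8_t *in, uint8_t *out,
                                 size_t len);
    void hal_radio_send(const uint8_t *frame, size_t len);
    void hal_deep_sleep(uint32_t seconds);
    uint16_t hal_read_humidity(void);

    void sensor_cycle(void)
    {
        uint8_t plaintext[BLOCK_LEN] = {0};
        uint8_t ciphertext[BLOCK_LEN];
        uint8_t iv[BLOCK_LEN];
        uint8_t frame[2 * BLOCK_LEN];

        /* Wake and sample. */
        uint16_t humidity = hal_read_humidity();
        memcpy(plaintext, &humidity, sizeof(humidity));

        /* Fresh IV per message; encryption runs in hardware,
         * so the CPU stays mostly idle. */
        hal_trng_read(iv, sizeof(iv));
        hal_aes_encrypt_otp_key(0, iv, plaintext, ciphertext, BLOCK_LEN);

        /* Transmit IV (in the clear, as usual) plus ciphertext. */
        memcpy(frame, iv, BLOCK_LEN);
        memcpy(frame + BLOCK_LEN, ciphertext, BLOCK_LEN);
        hal_radio_send(frame, sizeof(frame));

        /* Sleep until the next reading; wake again in an hour. */
        hal_deep_sleep(3600);
    }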

Given the number of small devices that might be distributed, performing maintenance on them, including battery replacement, is costly and time-consuming. Alcott said providing encryption at the MCU level allows for more efficient power consumption, since less software overhead frees up CPU bandwidth and memory, and Microchip's GB2 devices operate at a lower CPU frequency to save power.

Ghubril said Microchip's offering is not particularly unique from an encryption perspective, but it is one of the most power-conscious. Many vendors are offering MCUs with encryption features, he said, including Spansion, STMicro, NXP, Infineon, and Toshiba.

One of the chief concerns of users when adding features such as encryption to MCUs and SSDs is the effect on performance. A survey released by the Storage Networking Industry Association earlier this year found respondents had little interest in using built-in encryption features. Even though many SSDs shipping today have data protection and encryption features built in, those capabilities are often not switched on by OEMs, due to the misconception that encryption reduces performance.

Meanwhile, the major SSD makers have been releasing updated self-encrypting drives (SEDs). At the beginning of the year, Samsung added new features to its 840 EVO SSD that work with third-party security software, while SanDisk in May announced the early members of its ecosystem of security-management ISVs to support its recently announced X300 SSD, the company's first self-encrypting SSD based on the TCG Opal 2.0 specifications.


Vertica’s open approach to Big Data management | #HPdiscover

An analytical database management software company, Vertica was acquired by Hewlett-Packard in 2011. Since then, Vertica has helped develop software such as HAVEn, HP's platform for analyzing Big Data that leverages the open source Hadoop software.

Recently, Dave Vellante and Jeff Frick of theCUBE spoke with Chris Selland, head of business development at HP Vertica, during the HP Discover Las Vegas 2014 event.

Vellante first asked Selland about the state of Big Data and its status in the marketplace. Selland responded by talking about how everyone is realizing just how important Big Data can be to their company. This has caused many companies to shift their focus to how they handle their data.

The biggest problem that customers have when dealing with the massive amounts of data they generate on a daily basis is that they aren't sure how to deal with all of the technology used to manage and assess the data. The desire is there, but the infrastructure within many companies cannot support the new data types.

"All of this social data, machine data, Internet of Things data; traditional EDWs were not built to handle any of this," Selland said. "They can't scale, they don't perform, and they're way too expensive. So, they want to get in on new technology, but they haven't really figured out how yet."

When asked how Vertica approached data storage, especially in light of its use of Hadoop, Selland explained that the terminology the company has been using is "store, explore, and serve." Based on this philosophy, the team at Vertica feels there is no good reason to throw away data.

According to Selland, though, "Storing it and just sort of putting it in one place, particularly if you don't have it in a form that can be analyzed, isn't enough. Then you have to be able to explore it, look at it, see what I've got, and figure out what I might be able to do with it."
