Google May Push Sites to Use Encryption

A powerful voice at Google wants websites to be more secure.

In a move that experts say could make it harder to spy on Web users, Google is considering giving a boost in its search-engine results to websites that use encryption, the engineer in charge of fighting spam in search results hinted at a recent conference.

The executive, Matt Cutts, is well known in the search world as the liaison between Google's search team and website designers who track every tweak to its search algorithms.

Cutts also has spoken in private conversations of Google's interest in making the change, according to a person familiar with the matter. The person says Google's internal discussions about encryption are still at an early stage and any change wouldn't happen soon.

A Google spokesman said the company has nothing to announce at this time.

Encrypting data transmitted over the Internet adds a barrier between Web users and anyone who wants to snoop on their Internet activities or steal their information.

Google uses its search algorithm to encourage and discourage practices among web developers. Sites known to have malicious software are penalized in rankings, for instance, as are those that load very slowly. In total, the company uses more than 200 signals to help determine search rankings, most of which it doesn't discuss publicly.

If Google adds encryption to the list, it would give websites a big incentive to adopt it more widely.
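As a purely hypothetical illustration (Google publishes neither its signals nor their weights), the toy Python scorer below shows how a boolean "uses encryption" signal could sit alongside other ranking signals such as malware penalties and page speed; every name and number in it is invented.

```python
# Hypothetical sketch only: Google discloses neither its signals nor weights.
# This toy scorer just shows how a boolean HTTPS signal could be folded in
# alongside other ranking signals.
def toy_rank_score(relevance: float, load_time_s: float,
                   has_malware: bool, uses_https: bool) -> float:
    score = relevance
    if has_malware:
        score -= 1.0                              # heavy penalty for malicious sites
    score -= 0.1 * max(0.0, load_time_s - 2.0)    # penalize very slow pages
    if uses_https:
        score += 0.05                             # modest boost for encryption
    return score

print(toy_rank_score(relevance=0.8, load_time_s=1.2,
                     has_malware=False, uses_https=True))
```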

"This would be a wonderful thing," says Kevin Mahaffey, chief technology officer at mobile-security company Lookout. He says encryption ensures that a user's data can't be seen by others while moving across the Internet, that it can't be tampered with, and that it gets to the correct recipient.

Of course, that assumes the encryption works. Internet users were jolted this week by disclosures that a widely used encryption library, OpenSSL, contained a bug that could allow hackers to steal personal information.


Google said to be eyeing a boost to encrypted sites in search results

Websites that use encryption could be elevated in Google search results sometime in the future, according to The Wall Street Journal.

The algorithm change was hinted at by Matt Cutts, a top Google engineer, at the SMX West marketing conference last month, the report said. Cutts is in charge of combating spam in search results and acts as the liaison between Google's search team and website designers who track changes made to Google's search algorithms.

Early-stage internal discussions have also taken place at Google on incorporating encryption into its search rankings, the report said. The move would add a layer of security for Web users, while also giving companies an incentive to prioritize encrypting data.

"We have nothing to announce at this time," a Google spokeswoman told the IDG News Service.

Google's algorithms incorporate a range of signals that determine search-result prominence. Placing encrypted sites higher in the mix would serve as a signal of its own on the importance of security, amid concerns over cyberattacks and government surveillance.

Google has used HTTPS encryption, which is designed to cloak traffic flowing between its data centers and users, for services such as Gmail and Search for some time now.

The elevated rankings could help drive more people to those sites and keep their data secure, if the encryption is effective. Flaws recently discovered in OpenSSL, an encryption library used across much of the Web, have shown that some established security safeguards are not rock solid.



Open Source Software Is the Worst Kind Except for All of the Others

Heartbleed, for anyone who doesn't read the papers, is a serious bug in the popular OpenSSL security library. Its effects are particularly bad because OpenSSL is so widely used: it implements the secure part of HTTPS on many of the most popular web servers, such as Apache, nginx, and lighttpd.

A few people have suggested that the problem is that OpenSSL is open source, and code this important should be left to trained professionals. They're wrong. The problem is that writing and testing cryptographic software is really, really hard.

Writing and testing any sort of security software is hard, because the goals are more or less the opposite of normal software. For normal software, the main goal is to do the right thing with correct input. If it's a word processor or spreadsheet, you want it to compute and display the right results with reasonable input, but you don't much care what happens with unreasonable input. If you tell a spreadsheet to open a file full of garbage, and you get a strange screen display or the program crashes, that is at worst mildly annoying.

With security software, though, the entire value lies in making sure that it rejects every incorrect input and doesn't erroneously reveal secure material. This distinction is not one that is well understood in the computer industry. I can recall far too many reviews of desktop file encryption programs in which the reviewer went on at great length about the speed of encryption and decryption and the ease of use of the various screen displays, but never bothered to check that the program rejected attempts to decode data with the wrong password. Since the number of possible invalid inputs is stupendously greater than the number of valid inputs, ensuring that security software does what it is supposed to do presents a severe debugging and testing problem.
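As a minimal sketch of the kind of negative test those reviewers skip, the Python example below (using the third-party cryptography package) encrypts with one key and then confirms that decryption with a wrong key is rejected outright rather than quietly returning something.

```python
from cryptography.fernet import Fernet, InvalidToken

right_key = Fernet.generate_key()
wrong_key = Fernet.generate_key()

token = Fernet(right_key).encrypt(b"the secret document")

# Positive test: the correct key round-trips the data.
assert Fernet(right_key).decrypt(token) == b"the secret document"

# Negative test: the wrong key must be refused, not merely produce garbage.
try:
    Fernet(wrong_key).decrypt(token)
except InvalidToken:
    print("wrong key rejected, as it should be")
else:
    raise AssertionError("decryption with the wrong key was not rejected")
```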

Public key (PK) cryptography, the core functions for which everyone uses OpenSSL, is doubly difficult because the programming is really tricky. All PK algorithms depend on mathematical operations that are relatively easy to do but very difficult to reverse. A well-known example is multiplying two prime numbers to get their product vs. finding the two primes if you only know the product. All current algorithms involve arithmetic on very large numbers, much larger than any computer can handle without using special arithmetic libraries.
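Here is a toy Python illustration of that asymmetry, with deliberately small primes so the slow direction actually finishes; real keys use primes hundreds of digits long, for which the trial-division loop below would be hopeless.

```python
# Multiplying two primes is instant; recovering them from the product is not.
p, q = 104729, 1299709           # two (small) primes
n = p * q                        # the "easy" direction

def factor(n: int) -> tuple[int, int]:
    """Recover the primes by trial division -- the 'hard' direction."""
    d = 3
    while d * d <= n:
        if n % d == 0:
            return d, n // d
        d += 2
    raise ValueError("no odd factor found")

print(n)
print(factor(n))                 # (104729, 1299709), but only because n is tiny
```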

Even the relatively easy PK operations are still pretty slow, so the practical way to use PK is to use conventional shared-key cryptography to protect the web page or mail message or whatever, and only use the PK for one end to tell the other the shared key to use. Cryptography has advanced a lot in the decade and a half that OpenSSL has been in use, with both new PK algorithms and new shared-key algorithms, so there are now about two dozen combinations of initial PK and session shared-key schemes that OpenSSL has to support.
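The sketch below shows that hybrid pattern in Python, using a reasonably recent version of the third-party cryptography package rather than OpenSSL itself: a slow public-key operation (RSA-OAEP, chosen here purely for illustration) wraps a fresh shared session key, and the bulk data is encrypted with that shared key (AES-GCM).

```python
from os import urandom
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import rsa, padding
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

oaep = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                    algorithm=hashes.SHA256(), label=None)

# Receiver's long-term key pair; the public half is what a certificate carries.
private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
public_key = private_key.public_key()

# Sender: pick a fresh shared key, wrap it with PK, encrypt the payload with it.
session_key = AESGCM.generate_key(bit_length=256)
wrapped_key = public_key.encrypt(session_key, oaep)
nonce = urandom(12)
ciphertext = AESGCM(session_key).encrypt(nonce, b"the web page or mail message", None)

# Receiver: unwrap the shared key with the private key, then decrypt the bulk data.
recovered_key = private_key.decrypt(wrapped_key, oaep)
assert AESGCM(recovered_key).decrypt(nonce, ciphertext, None) == b"the web page or mail message"
```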

People have added extra complication to try to make things faster; one trick is to note that if you fetch a secure web page from a server, you'll probably fetch other stuff from the same server, so with the agreement of both ends they can leave the session open after the page is complete and reuse the same session and same shared key on subsequent requests. The Heartbleed bug is in "heartbeat" code that periodically checks whether the other end of an open session is still there. (One way to fix Heartbleed is just to turn off the session-saving feature, in which case everything will still work, just more slowly, since there will be more sessions to create.)
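The following toy model, which is emphatically not OpenSSL's actual code, captures the shape of the missing check: the reply is built from a buffer that also holds unrelated data, trusting the length the peer claims to have sent.

```python
# "memory" stands in for a buffer holding the heartbeat payload followed by
# unrelated secrets that happen to sit next to it.
memory = bytearray(b"PING" + b"...session keys, passwords, cookies...")

def heartbeat_buggy(claimed_len: int) -> bytes:
    # Echoes back claimed_len bytes with no check that the claim matches
    # the 4 bytes actually received.
    return bytes(memory[:claimed_len])

def heartbeat_fixed(claimed_len: int, actual_payload: bytes) -> bytes:
    # The fix: discard any request whose claimed length exceeds what the
    # peer really sent.
    if claimed_len > len(actual_payload):
        return b""
    return actual_payload[:claimed_len]

print(heartbeat_buggy(40))            # leaks bytes far past the 4-byte "PING"
print(heartbeat_fixed(40, b"PING"))   # b'' -- malformed request dropped
```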

As if all this weren't complex enough, public keys by themselves are just large pseudo-random numbers, which have to be securely associated with domain names for web servers or e-mail addresses for S/MIME mail. There's a whole Public Key Infrastructure (PKI) in which well-known entities can assert that a particular key belongs to a particular name using digital signatures, essentially PK encryption run backwards. The package of key, name, and a bunch of other stuff is known as a certificate, which is encoded using a system called ASN.1. The encodings and options in ASN.1 are so complicated that they are notoriously difficult to implement correctly, and encoding errors alone have led to multiple security issues.
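As a minimal sketch of what a certificate bundles together, the snippet below uses the third-party cryptography package (a reasonably recent version is assumed) to pull apart a PEM-encoded certificate; the file name cert.pem is just a placeholder for any certificate you have on hand.

```python
from cryptography import x509

with open("cert.pem", "rb") as f:
    cert = x509.load_pem_x509_certificate(f.read())

print(cert.subject.rfc4514_string())        # the name the key is bound to
print(cert.issuer.rfc4514_string())         # who vouches for that binding
print(cert.not_valid_before, cert.not_valid_after)
print(cert.signature_hash_algorithm.name)   # how the issuer's signature was made
print(cert.public_key().key_size)           # the large number that is the key (RSA/EC)
```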

Any software package that does what OpenSSL does has to do all the stuff I described above. OpenSSL also has a few issues of its own. One is its history: it evolved from an earlier package from the 1990s called SSLeay, which was apparently originally an experimental implementation of the large-number arithmetic needed for PK cryptography. SSLeay turned into OpenSSL in 1998, so there is now close to 20 years of evolutionary cruft in the half-million lines of OpenSSL code. It is written in the C programming language, which remains the lingua franca of the software community, in that no matter what language your application is written in, there's always a way for that language to connect to and use C libraries.

C grew up in the 1970s on small computers, notably 16-bit PDP-11s, where every bit of code and data space was precious, so it doesn't have much in the way of defensive programming features to detect and prevent buffer overruns and other bugs. Modern C applications can use libraries that provide much of this defensive programming, but rewriting old C code to be defensive is tedious, and doing so without introducing new bugs is hard, so people rarely do.


Heartbleed: Open source’s worst hour

Summary: People assumed that open source software is somehow magical, that it's immune to ordinary programming mistakes and security blunders. It's not.

Heartbleed was open source software's biggest failure to date. A simple OpenSSL programming mistake opened a security hole in a program that affected hundreds of millions of websites, and God alone knows how many users, who relied upon it for their fundamental security.

We know what happened. A programming blunder enabled attackers to pull down 64KB chunks of "secure" server memory. Of course, a hacker would then have to sift through this captured memory for Social Security numbers, credit-card numbers, and names, but that's trivial.

We know how it happened. German programmer Dr. Robin Seggelmann added a new "feature" and forgot to validate a variable containing a length. The code reviewer, Dr. Stephen Henson, "apparently also didn't notice the missing validation," said Seggelmann, "so the error made its way from the development branch into the released version." And then, for about two years, the defective code would be used, at one time or another, by almost every Internet user in the world.

Sorry, there was no grand National Security Agency (NSA) plan to spy on the world. It was just a trivial mistake with enormous potential consequences.

So why did this happen? Simple: everyone makes mistakes. Estimates of the number of errors per 1,000 lines of code (KLOC) range from 15 to 50, falling to around three if the code is rigorously checked and tested. OpenSSL has approximately 300,000 LOC. Think about it: even at the careful end of that range, that works out to something like 900 latent defects; at the high end, closer to 15,000.

Still, open source programming methodology is supposed to catch this kind of thing. By bringing many eyeballs to programs, a fundamental open source principle, it's believed more errors will be caught. It didn't work here.

This mistake, while not quite as much a beginner's blunder as Apple's "goto fail" fiasco, was the kind of simple-minded mistake that any developer might make if tired, and that anyone who knows their way around the language should have spotted.

So why didn't they? Was it because OpenSSL is underfunded and doesn't have enough programmers?

Was it because, as Poul-Henning Kamp, a major FreeBSD and security developer, put it, "OpenSSL sucks. The code is a mess, the documentation is misleading, and the defaults are deceptive. Plus it's 300,000 lines of code that suffer from just about every software engineering ailment you can imagine."


Did open source matter for Heartbleed?

Summary: Open source does not provide a meaningful inherent security benefit for OpenSSL and it may actually discourage some important testing techniques. Also, panhandling is not a good business model for important software like OpenSSL.

The ugly episode of Heartbleed has put OpenSSL under more scrutiny than any open source software project ever. At a certain level of scrutiny perhaps any program will look bad, but OpenSSL's on the hot seat because it's OpenSSL that failed in its mission. It's hard to construe these matters in a way that makes OpenSSL or the open source nature of it look good.

But who is this "OpenSSL"? When something goes wrong with a product, people want to know who is responsible. Many will be shocked to learn that it's all run by a small group of developers, most of them volunteers and all but one part-time. Huge parts of the Internet, multi-zillion dollar businesses, implicitly trust the work these people do. Why?

Let's stipulate that OpenSSL has a good reputation, perhaps even that it deserves that reputation (although this is not the first highly-critical vulnerability in OpenSSL). I would argue that the reputation is based largely on wishful thinking and open source mythology.

Before the word "mythology" gets me into too much trouble, I ought to say, as Nixon might have put it, "we're all open source activists now." For some purposes, open source is a good thing, or a necessary thing, or both. I agree, at least in part, with those who say that cryptography code needs to be open source, because it requires a high level of trust.

Ultimately, the logic of that last statement presumes that there are people analyzing the open source code of OpenSSL in order to confirm that it is deserving of trust. This is the "many eyeballs" effect described in The Cathedral and the Bazaar, by Eric Raymond, one of the early gospels in the theology of open source. The idea is that if enough people have access to source code then someone will notice the bugs.

This is, in fact, what has happened with Heartbleed... sort of. Heartbleed was discovered by Neel Mehta, a security researcher at Google. If you look at the vulnerability disclosures coming out of other companies, Apple and Microsoft for example, you can see that Google spends a lot of time scrutinizing other people's programs. They're like no other group in this regard.

But it took Google two years to find it. In the meantime, Google finds lots of security problems in Apple and Microsoft products for which they have no source code. This is because in the time since the formation of the "many eyeballs" hypothesis, there have been huge improvements in testing and debugging tools. Some computer time with a marginal cost of $0 is worth thousands of very expensive eyeballs.

I'd go so far as to suspect that the availability of source makes developers and users discount the necessity of the kind of testing that is common on commercial software. I wouldn't be surprised if a static source-code analyzer could have found the Heartbleed bug, flagging it for possible buffer over/underrun issues. Heartbleed might also have been found by a good round of fuzzing.
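Here is a rough sketch of the fuzzing idea: hammer a toy heartbeat parser with random records and check the one property that matters, namely that the reply never contains more bytes than the peer actually sent. The parser below is hypothetical, not OpenSSL's.

```python
import random

def parse_heartbeat(record: bytes) -> bytes:
    claimed = record[0]          # first byte: claimed payload length
    payload = record[1:]
    if claimed > len(payload):   # the validation Heartbleed was missing
        raise ValueError("claimed length exceeds actual payload")
    return payload[:claimed]

# Dumb fuzzer: random records, one invariant checked on every reply.
random.seed(0)
for _ in range(100_000):
    record = bytes(random.randrange(256) for _ in range(random.randrange(1, 32)))
    try:
        reply = parse_heartbeat(record)
    except ValueError:
        continue
    assert len(reply) <= len(record) - 1, "reply contains data the peer never sent"
print("no over-reads observed")
```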

As I said recently, some programs are so critical to society at large that someone needs to step in and make sure they are properly secured. Obviously the problem is money. So why, when this program is so critical, is it being run like it's public TV? Yes, like Blanche DuBois, OpenSSL has always depended on the kindness of strangers.


Portal designed to help students plan career

A city-based open source software development company has designed and developed a broad-based portal to help students plan their careers effectively, with the support of their colleges and parents.

The portal, http://www.altocarrera.com, is designed to bridge the knowledge gap between the requirements of industry and the available resources, NeelSys India Pvt Ltd CEO Srinivas Balasadi said in a teleconference from the US. Companies can look at macro metrics of the academic performance of colleges and their students, and big-data analysis of student performance across different disciplines and colleges would also help industry design its training programmes. Colleges and students, on the other hand, can look at industry requirements and select a career path dynamically, and colleges would be able to fine-tune their courses to suit the needs of industry, he explained.

The company chose to develop the platform using open source technologies to leverage the advantages of solutions available in the public domain, as well as to enable comprehensive customisation, Mohan of NeelSys India explained. Further, using open source ensures that the portal remains free for all users, he added. This makes NeelSys one of the few software companies in the city involved in product development.
