Black Duck Raises $20M To Advance Leadership In Open Source Software Logistics

Black Duck Software, an OSS Logistics solutions provider enabling the deployment and management of open source software (OSS), today announced that it has closed a $20 million investment round led by General Catalyst Venture Partners with all existing investors also participating in the round. The funding will be used to help the company expand its global go-to-market model to fulfill the rapidly growing OSS Logistics market opportunity. In addition, the company today announced the appointment of Stephen Gregorio as its Chief Financial Officer and Executive Vice President.

Gregorio, who played a critical role in securing the new funding, has over 25 years of financial management experience at high-growth technology companies including Verdasys, Interwise Corporation (acquired by AT&T), and Gensym Corporation, among others. Reporting directly to Black Duck President and CEO Lou Shipley, Gregorio will help Black Duck expand aggressively in the fast-growing OSS Logistics sector.

"With this funding in place, we are well-positioned to advance OSS Logistics solutions across the enterprise," said Shipley. "Over the next 18 months, we will release powerful new solutions aimed at solving critical supply chain and software development challenges that have resulted from the explosive growth of OSS in enterprises worldwide. Black Duck is among the first to recognize the need for a smarter, more efficient approach to streamlining, safeguarding, and managing the software development and deployment chain. As such, we are ready to help the world's most innovative organizations better leverage, secure, and grow their investments in open source to achieve greater business success."

According to Gartner, a leading market research firm, 95 percent of all IT organizations will leverage non-trivial elements of OSS technology in their mission-critical IT portfolios by 2016, and fewer than 50 percent of organizations will have implemented an effective strategy for procuring and managing OSS. Black Duck has a rich history of helping Fortune 1000 firms dramatically improve software quality, hasten application development lifecycles, and improve compliance while mitigating security risks.

Black Duck's Board of Directors and investment advisors are equally optimistic about the company's future in this emerging market.

"We've partnered with Black Duck since it was founded, and over the past decade the company has successfully evangelized and supported the safe and proper use of open source software for thousands of enterprises worldwide. Now that OSS has matured and become ubiquitous across organizations of all sizes, they are ready to drive the next phase of explosive innovation and growth in the market," said Larry Bohn, Managing Director at General Catalyst Partners. "Black Duck provides the only platform that enables enterprises to manage the increasingly complex OSS logistics frontier. It's a great example of a company we've been bullish on for a long time that maintained its focus while the market was catching up to its vision. Its future looks really exciting."

Further accelerating Black Ducks growth will be Gregorios proven financial management experience. In his previous positions, he successfully managed IPOs, handled merger and acquisition transactions on both sides, raised both debt and equity capital, and served as general counsel, negotiating customer, partnership, and channel agreements.


Time Travel Simulation Resolves “Grandfather Paradox”

What would happen to you if you went back in time and killed your grandfather? A model using photons reveals that quantum mechanics can solve the quandary, and even foil quantum cryptography

Entering a closed timelike curve tomorrow means you could end up at today. Credit: Dmitry Schidlovsky

On June 28, 2009, the world-famous physicist Stephen Hawking threw a party at the University of Cambridge, complete with balloons, hors d'oeuvres and iced champagne. Everyone was invited but no one showed up. Hawking had expected as much, because he only sent out invitations after his party had concluded. It was, he said, "a welcome reception for future time travelers," a tongue-in-cheek experiment to reinforce his 1992 conjecture that travel into the past is effectively impossible.

But Hawking may be on the wrong side of history. Recent experiments offer tentative support for time travel's feasibility, at least from a mathematical perspective. The study cuts to the core of our understanding of the universe, and the resolution of the possibility of time travel, far from being a topic worthy only of science fiction, would have profound implications for fundamental physics as well as for practical applications such as quantum cryptography and computing.

Closed timelike curves

The source of time travel speculation lies in the fact that our best physical theories seem to contain no prohibitions on traveling backward through time. The feat should be possible based on Einstein's theory of general relativity, which describes gravity as the warping of spacetime by energy and matter. An extremely powerful gravitational field, such as that produced by a spinning black hole, could in principle profoundly warp the fabric of existence so that spacetime bends back on itself. This would create a "closed timelike curve," or CTC, a loop that could be traversed to travel back in time.

Hawking and many other physicists find CTCs abhorrent, because any macroscopic object traveling through one would inevitably create paradoxes where cause and effect break down. In a model proposed by the theorist David Deutsch in 1991, however, the paradoxes created by CTCs could be avoided at the quantum scale because of the behavior of fundamental particles, which follow only the fuzzy rules of probability rather than strict determinism. "It's intriguing that you've got general relativity predicting these paradoxes, but then you consider them in quantum mechanical terms and the paradoxes go away," says University of Queensland physicist Tim Ralph. "It makes you wonder whether this is important in terms of formulating a theory that unifies general relativity with quantum mechanics."

Experimenting with a curve

Recently Ralph and his PhD student Martin Ringbauer led a team that experimentally simulated Deutsch's model of CTCs for the very first time, testing and confirming many aspects of the two-decades-old theory. Their findings are published in Nature Communications. Much of their simulation revolved around investigating how Deutsch's model deals with the grandfather paradox, a hypothetical scenario in which someone uses a CTC to travel back through time to murder her own grandfather, thus preventing her own later birth. (Scientific American is part of Nature Publishing Group.)

Deutsch's quantum solution to the grandfather paradox works something like this:

Instead of a human being traversing a CTC to kill her ancestor, imagine that a fundamental particle goes back in time to flip a switch on the particle-generating machine that created it. If the particle flips the switch, the machine emits a particle (the particle) back into the CTC; if the switch isn't flipped, the machine emits nothing. In this scenario there is no a priori deterministic certainty to the particle's emission, only a distribution of probabilities. Deutsch's insight was to postulate self-consistency in the quantum realm, to insist that any particle entering one end of a CTC must emerge at the other end with identical properties. Therefore, a particle emitted by the machine with a probability of one half would enter the CTC and come out the other end to flip the switch with a probability of one half, imbuing itself at birth with a probability of one half of going back to flip the switch. If the particle were a person, she would be born with a one-half probability of killing her grandfather, giving her grandfather a one-half probability of escaping death at her hands, good enough in probabilistic terms to close the causative loop and escape the paradox. Strange though it may be, this solution is in keeping with the known laws of quantum mechanics.
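The self-consistency idea can be made concrete with a small numerical toy. The sketch below is an illustration only, not the circuit used in the experiment: the looping particle is modeled as a single qubit, the paradox-inducing interaction as a bit flip, and we look for a state that is unchanged by a trip around the loop.

```python
import numpy as np

# Toy version of Deutsch's self-consistency condition for a single-qubit
# closed timelike curve (an illustration, not the experiment's actual
# circuit). The paradox-inducing interaction is modeled as a bit flip X:
# the particle goes back and flips the switch on the machine that made it.
X = np.array([[0, 1], [1, 0]], dtype=complex)

def one_loop(rho):
    """State after one trip around the loop: the bit flip is applied."""
    return X @ rho @ X.conj().T

# Start from a definite state ("the particle was emitted") and repeatedly
# average the state with its image after a loop -- a simple numerical way
# to converge on a fixed point of the map, i.e. a self-consistent state.
rho = np.array([[1, 0], [0, 0]], dtype=complex)
for _ in range(50):
    rho = 0.5 * (rho + one_loop(rho))

print(np.round(rho.real, 3))
# [[0.5 0. ]
#  [0.  0.5]]  -> the switch is flipped with probability one half
```

The only self-consistent solution is the maximally mixed state: the switch is flipped with probability one half, matching the probabilistic resolution described above.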

In their new simulation Ralph, Ringbauer and their colleagues studied Deutsch's model using interactions between pairs of polarized photons within a quantum system that they argue is mathematically equivalent to a single photon traversing a CTC. "We encode their polarization so that the second one acts as kind of a past incarnation of the first," Ringbauer says. So instead of sending a person through a time loop, they created a stunt double of the person and ran him through a time-loop simulator to see if the doppelganger emerging from a CTC exactly resembled the original person as he was in that moment in the past.


The Future of Security: Zeroing In On Un-Hackable Data With Quantum Key Distribution

Thieves steal data constantly, so protecting it is an ongoing challenge. There are more than 6,000 banks with 80,000 branches in the United States, nearly 6,000 hospitals and thousands of insurance companies, all holding data that we want kept private. Traditionally, their valued data is protected by keys, which are transmitted between sender and receiver. These secret keys are protected by unproven mathematical assumptions and can be intercepted, corrupted and exposed if a hacker eavesdrops on them during transmission. Specific problems with current encryption technology include:

Standard methods for exchanging cryptographic keys are in jeopardy. RSA-1024, once commonly used to exchange keys between browsers and web servers, has probably been broken; it's no longer regarded as safe by NIST, though RSA-2048 is still approved. This and other public-key infrastructure technologies perhaps haven't been broken yet but soon will be by bigger, faster computers. And once quantum computers are mainstream, data encrypted using existing key exchange technologies will become even more vulnerable.

Researchers are working on methods to improve the security of software-based key exchange using what is known as post-quantum cryptography: methods that will continue to be effective after quantum computers are powerful enough to break existing key exchange methods. These are all based on the unprovable assertion that certain numerical algorithms are difficult to reverse. But the question that remains is: difficult for whom? How do we know that an unpublished solution to these exact problems hasn't been discovered? The answer is we don't.

Quantum cryptography is the only known method for transmitting a secret key over long distances that is provably secure in accordance with the well-accepted and many-times-verified laws that govern quantum physics. It works by using photons of light to physically transfer a shared secret between two entities. While these photons might be intercepted by an eavesdropper, they can't be copied, or at least can't be perfectly copied (cloned). By comparing measurements of the properties of a fraction of these photons, it's possible to show that no eavesdropper is listening in and that the keys are thus safe to use; this is what we mean by provably secure. Though called quantum cryptography, the process actually only exchanges encryption keys, so researchers prefer the term quantum key distribution, or QKD, to describe it. The no-cloning theorem is one of the fundamental principles behind QKD, and why we think that this technology will become a cornerstone of network security for high-value data.
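To make the "compare a fraction of the photons" step concrete, here is a toy software simulation in the spirit of the well-known BB84 protocol. The article does not name the protocol used in any particular product, so the protocol choice, parameters, and the intercept-resend attack model below are assumptions for illustration only.

```python
import random

def run_bb84(n_photons=2000, eavesdropper=False):
    """Toy intercept-resend simulation: returns the error rate observed on a
    sacrificed sample of the sifted key."""
    alice_bits  = [random.randint(0, 1) for _ in range(n_photons)]
    alice_bases = [random.choice("+x") for _ in range(n_photons)]
    bob_bases   = [random.choice("+x") for _ in range(n_photons)]

    bob_bits = []
    for bit, a_basis, b_basis in zip(alice_bits, alice_bases, bob_bases):
        if eavesdropper:
            # Eve measures in a random basis and resends; a wrong guess
            # leaves her (and the resent photon) with a random bit.
            e_basis = random.choice("+x")
            bit = bit if e_basis == a_basis else random.randint(0, 1)
            a_basis = e_basis  # the resent photon is prepared in Eve's basis
        # Bob reads the bit faithfully only when his basis matches the photon's.
        bob_bits.append(bit if b_basis == a_basis else random.randint(0, 1))

    # Sifting: keep positions where Alice's and Bob's publicly announced
    # bases agreed, then sacrifice part of the sifted key to check for errors.
    sifted = [(a, b) for a, b, x, y
              in zip(alice_bits, bob_bits, alice_bases, bob_bases) if x == y]
    sample = sifted[: len(sifted) // 4]
    return sum(a != b for a, b in sample) / len(sample)

print("error rate without eavesdropper:", run_bb84(eavesdropper=False))  # ~0.00
print("error rate with eavesdropper:   ", run_bb84(eavesdropper=True))   # ~0.25
```

A clean channel yields an error rate near zero, while an intercept-resend attacker unavoidably introduces roughly 25 percent errors in the sifted sample, which is how the legitimate parties detect the eavesdropper and discard the compromised key.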

While products based on QKD already are being used by banks and governments in Europe, especially Switzerland, they have not been deployed commercially in the United States to any great extent. Current technological breakthroughs are pushing the distance over which quantum signals can be sent. Trials using laboratory-grade hardware and dark fibers (optical fibers laid down by telecommunications companies but lying unused) have sent quantum signals three hundred kilometers, but practical systems are currently limited to distances of about 100 kilometers. A scalable architecture that includes a Trusted Node to bridge the gap between successive QKD systems can both extend the practical range of this technology and allow keys to be securely shared over a wide-ranging network, making large-scale implementation possible and practical. Cybersecurity is making progress toward the future reality of sending data securely over long distances using quantum physics.
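The trusted-node idea can be illustrated with a short sketch. The article does not describe any vendor's actual key-relay scheme, so the hop-by-hop one-time-pad relay below is an assumed, commonly described approach: the node shares a QKD-derived key with each neighbor and forwards the end-to-end key encrypted under each hop key in turn.

```python
import secrets

def xor_bytes(a: bytes, b: bytes) -> bytes:
    """One-time-pad combine of two equal-length byte strings."""
    return bytes(x ^ y for x, y in zip(a, b))

key_len = 32
end_to_end_key = secrets.token_bytes(key_len)      # key Alice wants to share with Bob

hop_key_alice_node = secrets.token_bytes(key_len)  # established by the QKD link Alice <-> node
hop_key_node_bob   = secrets.token_bytes(key_len)  # established by the QKD link node <-> Bob

# Alice -> node: the end-to-end key protected by the first hop key.
ciphertext_1 = xor_bytes(end_to_end_key, hop_key_alice_node)

# The trusted node decrypts (hence "trusted") and re-encrypts for Bob.
plain_at_node = xor_bytes(ciphertext_1, hop_key_alice_node)
ciphertext_2 = xor_bytes(plain_at_node, hop_key_node_bob)

# Bob recovers the same end-to-end key at the far side of the network.
recovered = xor_bytes(ciphertext_2, hop_key_node_bob)
assert recovered == end_to_end_key
```

The security trade-off is visible in the middle step: the node briefly holds the key in the clear, which is why it must be physically and administratively trusted, and why the approach extends range without requiring quantum repeaters.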

As an example, my team at Battelle, together with ID Quantique, has started to design and build the hardware required to complete a 650-kilometer link between Battelle's headquarters and our offices in Washington, D.C. We are also planning a network linking major U.S. cities, which could exceed 10,000 kilometers, and are currently evaluating partners to work with us on this effort. For the past year, we have used QKD to protect the networks at our Columbus, Ohio headquarters. But we're not alone when it comes to quantum-communication efforts. Last month, China started installing the world's longest quantum-communications network, which includes a 2,000-kilometer link between Beijing and Shanghai.

Many nations acknowledge that zeroing in on un-hackable data security is a must, knowing that even the best standard encryption that's considered unbreakable today will be vulnerable at some point in the future, likely the near future. QKD is the best technically feasible means of generating secure encryption. Yes, it has its challenges, but continued innovation is tackling these issues and bringing us closer to the reality of long-distance quantum rollouts and truly secure and future-proofed network technology.

Does this mean that software-based methods won't have any value for network security applications? Of course not. One must always evaluate the cost of the protection against the cost associated with the loss of your data. But part of that evaluation must include the certainty of the security solution. So, while post-quantum cryptography and QKD may both be secure enough for a particular application, we use QKD when we want to know that our data is secure, without having to rely on unproven assumptions that it is.

In the long run, we envision an integrated network that includes software-based methods, which we call Tier III (cost-conscious), alongside higher-security and commercially viable QKD (Tier II) solutions that use quantum methods with Trusted Nodes to distribute keys, but conventional encryption (AES, for example) to protect actual data. In this vision, there is also a higher-level Tier I (very secure, very expensive) that uses quantum repeaters to transmit long, quantum-based keys and one-time-pad encryption to protect our highest-value data, mostly government and military information.
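As a sketch of the Tier II idea, where quantum-distributed keys protect data through conventional symmetric encryption, the snippet below uses AES-GCM from the third-party Python cryptography package with a stand-in for a QKD-delivered key. The key source, key size, and payload here are assumptions for illustration, not a description of any deployed system.

```python
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

# Stand-in for a 256-bit key delivered over a QKD link; in a real Tier II
# deployment this value would come from the key-distribution hardware.
qkd_key = os.urandom(32)
nonce = os.urandom(12)  # AES-GCM nonce; must never be reused with the same key

aead = AESGCM(qkd_key)
ciphertext = aead.encrypt(nonce, b"high-value payload", None)
assert aead.decrypt(nonce, ciphertext, None) == b"high-value payload"
```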

QKD is an attractive solution for companies and organizations that have very high-value data. If you have data that you want to protect for years, QKD makes a lot of sense. I think you'll see this distributed across the country to protect that high-value, long-duration data. This is the future.


Brandis boosts vetting of APS staff to prevent insider threats

Australian Government agencies will be required to vet their staff on an ongoing basis in order to protect sensitive government data against the kind of "insider threat" posed by the likes of Edward Snowden and Bradley Manning.

Attorney-General George Brandis this morning unveiled revised mandatory requirements for how agencies should screen employees, which could potentially see periodic staff security assessments replaced by dynamically pushed information, to keep tabs on staff on an ongoing basis.

Brandis recently directed his department to review the existing personnel security policy under the Australian Government's protective security policy framework (PSPF), which sets out the controls government agencies are expected to take to protect their people, information and assets.

"The changes to the personnel security policy aim to reduce the risk of loss, damage or compromise of Commonwealth resources by providing assurance about the suitability of personnel authorised to access those resources, in response to risks posed by insider threats such as Edward Snowden and Bradley (now Chelsea) Manning," Brandis said.

"They aim to minimise the potential for misuse of those resources, either by inadvertent or deliberate disclosure," he told delegates at the Security in Government Conference today.

"To address the risks that could arise from a trusted insider, the importance of security vetting, contact reporting and ongoing monitoring of our employees' suitability to access information should never be underestimated."

Brandis also asked the Attorney-General's Department to explore vetting in a paradigm of evolving threat, specifically "dynamic vetting", in which information about an employee requiring clearance is pushed to the vetting agency, rather than being provided by the employee themselves.

"There is a need to change our focus from point-in-time suitability assessments to continuous monitoring and assessments of each person's ongoing suitability," Brandis said.

"The new and emerging threats we face require Government to constantly revisit and revise our approach to national security. This should be extended to personnel security and vetting, where it is not enough to simply tick and flick an application every few years.

"We must take a dynamic, not a static, approach to the assessment of suitability."


David Klann Talks About Using Open Source Software in Broadcast Radio (Video)

Tim Lord: David, what is community radio and how did you first get involved with that?

David Klann: Yeah, that's a great question. There are lots of different kinds of radio stations out there; the most common of course is the commercial radio station, where they sell time. It is kind of like with Google: the product is not Gmail, the product is you, the user. In radio, the product is the listener. And then there's public radio. In my home state, we have Wisconsin Public Radio, and there's National Public Radio. And then there's community radio. Community radio stations are typically independent. They are typically run almost completely by volunteers. Ours has three part-time paid staff members, and the rest of the station is run by volunteers.

Tim: Tell everyone the name and the frequency of your community radio.

David: Oh sure. The radio station that I am associated with is WDRT. It is in the Driftless region of Wisconsin and we are on 91.9 FM and wdrt.org.

Tim: How big an area does that actually take in?

David: Compared to some it is a small station. We are 480 watts and we cover about a 25-mile radius around the tower. So it is a pretty small geographic footprint but we like to think that we are making a huge impact in the community.

Tim: Running a radio station is a lot different from people doing person-to-person communication, as in HAM radio.

David: Yeah, right.

Tim: What are some of the complications? What is some of the equipment that you use, for instance? How do you get a signal from soup to nuts; how do you actually put a signal out on an FM station?

David: Sure. I think the main thing is that the FCC is heavily involved. I think it is partly because these things are such high power. Even at 500 watts we are far more powerful than a lot of HAM radio outfits, certainly more powerful than the old CB radios, and more powerful than individual radios on the cell network. I think it is partly because of the large power output and also because of the limited spectrum that was originally allocated for FM radio. Radio stations, unlike other over-the-air wireless communications, are first of all one way. It is all being sent out from a source. So at our station, and this is pretty typical of radio stations, we have all the input devices: microphones, turntables, tape players, CD players, computers, iPods, whatever people bring to the station. All that gets funneled through what we call the audio chain. At some point, right before it leaves the station, we digitize it, and we send it to two places: we send one half, not really half, but we send one copy of it up to the transmitter. We use a leased Ethernet line for that. And then we send another copy of it out to the stream on the internet. And so our internet stream and our FM broadcast are identical. In the chain from the studio to the transmitter, you've got an encoded piece of audio that gets sent up over the Ethernet. At the other end, at the transmitter end, it gets decoded; it turns back into analog audio and then is sent to the transmitter just via coaxial cable.
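For readers who want to picture that fan-out in code, here is a very rough sketch. This is not WDRT's actual software; the host names, ports, and plain-TCP transport are hypothetical, and real studio-to-transmitter links and stream relays use their own protocols.

```python
import socket

# Hypothetical endpoints standing in for the leased-line studio-to-transmitter
# link and the internet stream relay described in the interview.
TRANSMITTER_LINK = ("stl.example.org", 9000)
STREAM_RELAY = ("stream.example.org", 8000)

def fan_out(encoded_chunks):
    """Send identical copies of already-encoded audio to both destinations."""
    stl = socket.create_connection(TRANSMITTER_LINK)
    web = socket.create_connection(STREAM_RELAY)
    try:
        for chunk in encoded_chunks:
            stl.sendall(chunk)  # copy 1: up to the tower, decoded back to analog there
            web.sendall(chunk)  # copy 2: out to internet listeners
    finally:
        stl.close()
        web.close()
```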


What is cryptography? – Definition from WhatIs.com

Cryptography is a method of storing and transmitting data in a particular form so that only those for whom it is intended can read and process it.

Cryptography is closely related to the disciplines of cryptology and cryptanalysis. Cryptography includes techniques such as microdots, merging words with images, and other ways to hide information in storage or transit. However, in today's computer-centric world, cryptography is most often associated with scrambling plaintext (ordinary text, sometimes referred to as cleartext) into ciphertext (a process called encryption), then back again (known as decryption). Individuals who practice this field are known as cryptographers.

Modern cryptography concerns itself with the following four objectives:

1) Confidentiality (the information cannot be understood by anyone for whom it was unintended)

2) Integrity (the information cannot be altered in storage or transit between sender and intended receiver without the alteration being detected)

3) Non-repudiation (the creator/sender of the information cannot deny at a later stage his or her intentions in the creation or transmission of the information)

4) Authentication (the sender and receiver can confirm each other's identity and the origin/destination of the information)

Procedures and protocols that meet some or all of the above criteria are known as cryptosystems. Cryptosystems are often thought to refer only to mathematical procedures and computer programs; however, they also include the regulation of human behavior, such as choosing hard-to-guess passwords, logging off unused systems, and not discussing sensitive procedures with outsiders.

The word is derived from the Greek kryptos, meaning hidden. The origin of cryptography is usually dated from about 2000 BC, with the Egyptian practice of hieroglyphics. These consisted of complex pictograms, the full meaning of which was only known to an elite few. The first known use of a modern cipher was by Julius Caesar (100 BC to 44 BC), who did not trust his messengers when communicating with his governors and officers. For this reason, he created a system in which each character in his messages was replaced by a character three positions ahead of it in the Roman alphabet.
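For illustration, here is a minimal version of that substitution scheme in Python, using the modern 26-letter alphabet and Caesar's shift of three; the message below is an invented example.

```python
def caesar_encrypt(message: str, shift: int = 3) -> str:
    """Replace each letter with the letter `shift` positions ahead, wrapping Z to A."""
    result = []
    for ch in message.upper():
        if ch.isalpha():
            result.append(chr((ord(ch) - ord("A") + shift) % 26 + ord("A")))
        else:
            result.append(ch)  # leave spaces and punctuation untouched
    return "".join(result)

def caesar_decrypt(message: str, shift: int = 3) -> str:
    """Decryption is just encryption with the opposite shift."""
    return caesar_encrypt(message, -shift)

print(caesar_encrypt("ATTACK AT DAWN"))   # DWWDFN DW GDZQ
print(caesar_decrypt("DWWDFN DW GDZQ"))   # ATTACK AT DAWN
```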

In recent times, cryptography has turned into a battleground of some of the world's best mathematicians and computer scientists. The ability to securely store and transfer sensitive information has proved a critical factor in success in war and business.
