OnePeople

It's not about free, it's about freedom.

Archive for the ‘Open Source Software’ Category

What you need to know about the 2009 DOD OSS Memo

with 3 comments

In mid-October, the U.S. Department of Defense CIO released a memo on the use of open source software in the DOD. The Clarifying Guidance Regarding Open Source Software (OSS) was hailed as a tremendous leap forward for open source software in the US Government. And indeed it is. At its heart, the memo is fairly simple. The basic points are:

  • This is not formal policy, just a clarification of policy that already exists.
  • OSS is COTS (Commercial Off-the-Shelf) software, and the same rules that apply to regular software apply to OSS. In other words: you cannot disqualify an open source software product just because it is open source.
  • Further, the memo reminds us that COTS software has special status in DOD procurements, because you’re supposed to consider commercial alternatives before writing your own.

The memo has been under development for 18 months, and can trace its lineage to the DOD-commissioned report by MITRE. You can think of the 2007 Navy memo as a kind of prototype for this document, which applies to all of DOD.

The memo’s Attachment 2, though, goes further, describing several specific benefits that open source software can offer the DOD:

Written by gunnar

November 13th, 2009 at 5:18 pm

Read My Ramblings About CONNECT

without comments

Here’s a really nice writeup on the CONNECT Code-a-thon at iHealthBeat. They quote me a lot, which is what makes it really nice.

Written by gunnar

September 21st, 2009 at 3:23 pm

US Courts: Open Source Will Make You Break the Law

with 4 comments

Most of you already know about the US Courts’ shameful profiteering through the PACER system. They charge $0.08/page for public court documents and in so doing stifle the public’s access to their own content. Not long ago, our friends at CITP released an open source project called RECAP. When you install this gem in your browser, documents you retrieve from PACER are deposited in a public archive, where they can be retrieved by everyone, at no cost, forever.

So no surprise that US Courts is now discouraging the use of the plugin. They stand to lose a lot of cash if these documents are free.

What surprised me first is this little gem:

A fee exemption applies only for limited purposes. Any transfer of data obtained as the result of a fee exemption is prohibited unless expressly authorized by the court. Therefore, fee exempt PACER customers must refrain from the use of RECAP. The prohibition on transfer of information received without fee is not intended to bar a quote or reference to information received as a result of a fee exemption in a scholarly or other similar work.

Written by gunnar

August 25th, 2009 at 1:08 am

My OSCON 2009 Talk on Open Source in Government

with 3 comments

The good people at O’Reilly have posted my Open Source in Government talk at OSCON 2009 on blip.tv. It’s also on YouTube. I’ll admit to cringing a bit when I started watching, but I’m pretty happy with how it all went. Here are the slides.

In the panel afterward, someone asked me why open source developers should be helping companies make money on open source software, or helping the military-industrial complex or the prison system. I completely sympathize. There’s no reason whatsoever that someone should help the military or the prison system if they don’t want to. Those were just the examples I used. There are many other opportunities to work with government, especially at the local level. A good way to start is to find something that’s annoying or broken in your local schools or library and use open source software to fix it. Open Source for America should be making it easier for people to find these opportunities. But more on that later.

Written by gunnar

July 26th, 2009 at 11:51 am

Open Source on the Battlefield

with 5 comments

Two soldiers in a hastily built watchtower.


In Iraq, Sergeant 1st Class Martin Stadtler had nothing. He was stationed near Mosul, at a base that covers 24 square kilometers. Surrounding the base was a wall, and at intervals along that wall stood watchtowers. Those towers were improvised; they were large concrete water pipes, stood on their ends.

Inside each tower sat a pair of soldiers, watching for insurgents. To communicate with home base, they had standard-issue tactical radios. Unfortunately, these radios couldn’t reach home base; the base was simply too big. Soldiers had to play a game of Telephone: one tower radioed the next until the message finally reached a tower within range of home base. Obviously, this would not do.

Written by gunnar

July 22nd, 2009 at 4:09 pm

The NSA’s Security Challenge

without comments

Using open source software, the National Security Agency was able to gather a community of professional and amateur security experts together to make unprecedented security protections available to the public.

The National Security Agency has a mission. It is not just the nation’s code keeper and code breaker; it must also ensure the security of the nation’s digital infrastructure. Ironically, it had a security problem: the ecosystem for the software that was keeping top secret information secret was deeply broken. There was little competition and no innovation, and this essential software was expensive, slow to market, and antiquated.

Multi-Level Security, or MLS, is a complex problem: how do you allow data with many different security classifications to exist on the same machine? MLS software is difficult to get right and easy to get wrong, and it is subject to a stringent certification process. Although useful in certain corners of the private sector, there’s really only one customer for this kind of software: government. And once you’ve deployed an MLS system, it’s very difficult to move to another, because every MLS system is different. These are near-perfect conditions for very expensive, proprietary software that doesn’t innovate.
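To give a flavor of what MLS software actually enforces: most designs trace back to the classic Bell-LaPadula rules, usually summarized as “no read up, no write down.” Here is a minimal sketch of those two checks (Python for illustration; the level names and function names are my own, not any certified implementation):

```python
# Security levels, ordered from lowest to highest.
LEVELS = {"UNCLASSIFIED": 0, "SECRET": 1, "TOP SECRET": 2}

def can_read(subject_level, object_level):
    """'No read up': a subject may only read data at or below its clearance."""
    return LEVELS[subject_level] >= LEVELS[object_level]

def can_write(subject_level, object_level):
    """'No write down': a subject may only write at or above its own level,
    so that highly classified data cannot leak into a less classified file."""
    return LEVELS[subject_level] <= LEVELS[object_level]
```

The hard part, of course, is not these two comparisons but enforcing them on every read and write in an operating system, and proving to a certifier that nothing slips past them.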

Written by gunnar

July 22nd, 2009 at 2:21 pm

The Navy’s Standardization Problem

without comments

Using open source software, the US Navy was able to standardize the shipboard systems on its new destroyers, reducing the complexity of the ship’s systems and their reliance on proprietary real-time software. Wall Street now uses this same technology to execute orders predictably, without relying on vendor-specific hardware and software.

Every ship in the Navy is a floating data center. Computers run the ship, handle navigation, and track inventory. There are mail servers, databases, and everything else you would expect in a corporate data center. Unlike a corporation, though, the Navy also has weapons systems and radars. These systems are unique, since they must perform in a very predictable way: when you pull a trigger, you can’t wait for the computer to send an email. It has to happen right away. This determinism in a computer system is called “real-time” performance.
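“Real-time” here means bounded lateness, not raw speed. A rough sketch of the problem (Python for illustration; the function is my own, not anything the Navy runs): schedule a task on a fixed period and measure how late each wake-up is. On a general-purpose OS, that worst-case lateness is unbounded; a real-time system guarantees a ceiling on it.

```python
import time

def measure_worst_lateness(period_s=0.01, iterations=50):
    """Wake up every period_s seconds and record how far each wake-up
    drifts past its deadline. Returns the worst observed lateness in
    seconds -- a crude proxy for scheduling jitter."""
    deadline = time.monotonic()
    worst = 0.0
    for _ in range(iterations):
        deadline += period_s
        # Sleep until the next deadline (or not at all, if already late).
        time.sleep(max(0.0, deadline - time.monotonic()))
        lateness = time.monotonic() - deadline
        worst = max(worst, lateness)
    return worst
```

On a loaded desktop this number can spike to tens of milliseconds; a weapons system needs a hard guarantee instead of a measurement.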

The Navy has already saved millions by moving to industry-standard computers and commercially available software. The real-time requirement flew in the face of that progress: real-time software is very expensive, and often very proprietary. Frequently, real-time systems require specialized hardware as well, and these systems mean special training for the operators. So a ship needed two sets of infrastructure: one to run regular applications, and one to run the real-time applications. This was expensive and inefficient, especially since a Navy ship is so constrained by its lack of space. It would be much easier to have the regular computers handle the real-time work.

Written by gunnar

July 21st, 2009 at 2:13 pm

Washington Monthly on Open Source in Healthcare

without comments

If this is the future of computing as a whole, why should U.S. health IT be an exception? Indeed, given the scientific and ethical complexities of medicine, it is hard to think of any other realm where a commitment to transparency and collaboration in information technology is more appropriate. And, in fact, the largest and most successful example of digital medicine is an open-source program called VistA…

– Phillip Longman, “Code Red”

Written by gunnar

July 15th, 2009 at 7:33 am

Patents, Video, and an Open Internet

with 10 comments

For a number of reasons, I’m fascinated by the fight over the <video> tag in HTML5 as related by Ryan Paul of Ars Technica – and not just because I like the idea of not having to install a plugin to watch video online.

On the technical side, it’s mind-boggling to think about the possible consequences of some of these decisions. You have Google suggesting that the wrong codec would demand more bandwidth to run YouTube than is available on the entire Internet. That’s a big number. I am sincerely glad I’m not the engineer who has to manage changes at that scale.

More optimistically, you have the prospect of having native support for video in every browser, without paying or contracting with Adobe for the privilege. That’s exciting.

The friendly rivalry between Theora and H.264 is also neat to watch. I think it’s great that the Theora folks are responding to Apple and Google’s quality and performance concerns. It sounds like addressing their objections has made for a better standard.

Written by gunnar

July 5th, 2009 at 6:31 pm

Open Source and Open Standards

with one comment

Open standards are motherhood and apple pie – they ensure a level playing field in which many implementations can compete against each other, keep the barrier to participation low for newcomers, will outlive any given company, and ensure that systems can communicate with each other with a minimum of fuss. In other words, open standards create efficient and durable markets.

Open standards also keep costs low for buyers, who have many options and a minimum of friction when they want to switch from one implementation to another. Because the standard is open, there is no danger of being locked into a single vendor, since anyone can create a new implementation against the standard. And because the specification is public, there’s no danger of the standard disappearing, becoming unsupported, or later being made proprietary. An open standard will encourage these efficient, durable markets for as long as the standard is useful.

Written by gunnar

July 3rd, 2009 at 7:50 pm