In the March 2008 issue of Wired magazine, Chris Anderson wrote an article positing that the marginal cost of the technologies that drive the Internet (and, by inference, software) will approach zero over time. That timeline is shorter than many think. Because the incremental cost of the next bit of hard drive space is too small to dicker with, and the number of people over whom that cost is amortized is so large, Yahoo! can offer unlimited e-mail storage and YouTube can offer unlimited file uploading.
There are certain points at which this theory does not hold. A crucial concept of the Anderson article is that there must be enough consumers over whom to amortize the costs for the marginal cost to become effectively zero. Building a specific application for one person or a small number of people will incur significant cost because it cannot be reused over and over again. Furthermore, laws such as Moore's Law cannot extend indefinitely; they will eventually hit the limits of physics.
However, effectively, what Anderson posits means that software should be a depreciating asset, similar to a car. Cars lose value over time because they don't run as well as they did when they were new, and newer cars run more efficiently and effectively. The same can be said for software. A good example is spreadsheet software. Lotus 1-2-3, introduced in 1984, cost $499 for the PCjr. In 1985, Excel was released for $195; today, it costs $229 (cheaper in inflation-adjusted terms than the 1985 version) and has exponentially more power. Today, OpenOffice and Google Docs offer free spreadsheets.

Yet, despite the declining value of purchasing proprietary software, the U.S. government almost always pays license fees and doesn't own the code of the software its vendors produce for it. Rather than owning code and being able to get incrementally cheaper upgrades that take advantage of the increasing power and decreasing unit cost of technological improvements, the government is stuck with rapidly devaluing and depreciating software. We certainly understand the risk aversion and the lack of desire to be on the bleeding edge that many government agencies have, but buying seven- and ten-year licenses (and often not even having rights to the code after that time) seems, in light of the Anderson article, counterintuitive.
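As a rough check on the inflation-adjusted comparison above, the back-of-the-envelope arithmetic might look like the sketch below. The CPI figures are approximate assumptions used only for illustration, not official data.

```python
# Rough sketch of the inflation-adjusted price comparison.
# CPI values are approximate assumptions for illustration only.
CPI_1985 = 107.6   # approximate U.S. CPI-U annual average, 1985 (assumed)
CPI_TODAY = 215.0  # approximate U.S. CPI-U annual average, late 2000s (assumed)

excel_1985_price = 195.00
excel_today_price = 229.00

# Express the 1985 price in today's dollars.
excel_1985_in_todays_dollars = excel_1985_price * (CPI_TODAY / CPI_1985)

print(f"$195 in 1985 is roughly ${excel_1985_in_todays_dollars:.0f} in today's dollars")
print("Today's $229 price is cheaper in real terms:",
      excel_today_price < excel_1985_in_todays_dollars)
```

Under these assumed CPI figures, $195 in 1985 works out to roughly $390 today, which is why the current $229 price is cheaper in real terms even before accounting for the added capability.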
Here are three steps government contracting officers could take to improve the situation:
- Require ownership of the code in the RFPs/RFQs: Many companies bank future profits on licensing fees. Make them earn profits on quality software instead.
- Limit the length of contracts for maintenance: If computing power doubles every 18 months and storage capacity doubles every two years, why lock into a timeframe that gets overcome by events? (See the sketch after this list.)
- Write requirements that adapt to the future, not reflect the near past: Demand that software be capable of hitting standards on the next generation of systems, not just the existing one. It will force scalability, modularity, and clean development.
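To make the second point concrete, here is a small sketch of how much raw capability changes over a seven- to ten-year contract term, assuming the doubling periods cited above (18 months for computing power, two years for storage). These are rules of thumb, not guarantees, and the exact figures are illustrative.

```python
# Sketch: capability growth over a contract term, assuming the doubling
# periods cited in the list above (rules of thumb, not guarantees).

def growth_factor(years: float, doubling_period_years: float) -> float:
    """How many times capability multiplies over the given number of years."""
    return 2 ** (years / doubling_period_years)

for term in (7, 10):
    compute = growth_factor(term, 1.5)  # computing power: ~18-month doubling
    storage = growth_factor(term, 2.0)  # storage capacity: ~2-year doubling
    print(f"{term}-year contract: compute ~{compute:.0f}x, storage ~{storage:.0f}x")
```

By this arithmetic, a seven-year lock-in spans roughly a 25-fold increase in computing power and an 11-fold increase in storage; a ten-year lock-in spans roughly 100-fold and 32-fold increases. A license term that long is almost guaranteed to be overtaken by the underlying technology.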