Evaluating Open Source Software
William Cohen
NCSU CSC 591W
January 14, 2008
Based on David Wheeler, “How to Evaluate Open Source Software / Free Software (OSS/FS) Programs”
Why Evaluate Existing Software?
● Determine if suitable software already exists
● Select the best software possible
● Make use of work that is already available
Evaluation Steps
● Identify candidates
● Read reviews
● Compare leading candidate programs on basic attributes
● Analyze top candidates in more depth
Identify Candidates
● Search sites that index OSS software packages:
  ● Freshmeat
  ● SourceForge
  ● Savannah
  ● FSF Software Directory
● Use a search engine
Effective Searching
● Possible search engine biases:
  ● May not show software that competes with the engine's own offerings
  ● Paid advertising may be placed before the links of interest
● Search terms:
  ● Identify key words and combinations of them
  ● Use the names of packages already found
  ● Use data formats that the software might handle, e.g. jpg
Read Reviews
● Make use of other people's experience with the software:
  ● May be faster than installing the software locally
  ● May compare several of the packages you are already considering
● Search for reviews via search engines
● Review biases:
  ● The publisher of a review may accept ad revenue
  ● Software fans may bias votes or comments on the software
Software Market Share
● Hard to determine how many copies are running
  ● Users do not have to report the number of copies they run
● Downloads hint at relative use, but not absolute use:
  ● One download may be used many times
  ● Software fans may inflate download numbers
● Data on network-visible attributes (see the sketch below):
  ● Websites indicate the software used to build them
  ● Web browsers identify themselves, e.g. Mozilla, Internet Explorer
● Look at which distributions include the software package
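A server's HTTP response headers are one network-visible hint about the software behind a website. The sketch below is a rough illustration in Python: it prints a site's Server header; the URL is only a placeholder.

    # Minimal sketch: inspect a site's Server header for hints about its software stack.
    # Assumes Python 3 and its standard library; the URL is a placeholder, not a real survey target.
    import urllib.request

    url = "https://www.example.org/"   # hypothetical site to inspect
    with urllib.request.urlopen(url, timeout=10) as response:
        server = response.headers.get("Server", "not reported")
        print(f"{url} reports Server: {server}")

Many sites deliberately omit or alter this header, so it only hints at market share rather than measuring it.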
Compare Candidates for Basic Attributes
● Reduce the list of possible software down to “likely candidates”
● Examine the OSS project web pages for basic attributes:
  ● Brief description of the project
  ● Frequently Asked Questions (FAQs)
  ● Project documentation
  ● Mailing lists
  ● Links to related projects
Basic Attributes
● Functionality
● Cost
● Market Share
● Support
● Maintenance
● Reliability
● Performance
● Scalability
● Usability
● Security
● Flexibility/Customizability
● Interoperability
● Legal/License issues
Functionality
● Does it work on the desired platform, e.g. Microsoft Windows, Apple OS X, or Linux?
● Have a list of the functionality you need and want
● Does the program do what you want?
● How hard would it be to add missing functionality to the OSS package?
Cost
● “Free” in Free Software refers to liberty, not price
● There may be nominal costs for getting the software on media
● Setup costs:
  ● Initial installation
  ● Migrating existing data
  ● Training
  ● Hardware needed
  ● Staffing
● License fees
● Upgrade/maintenance costs
Market Share
● Is a significant number of people using the software?
● A large user base suggests:
  ● Many people find the software useful
  ● Support for the software is more likely
Support
● Help with training, installing, and fixing issues with the software
● Is the documentation readable and useful?
● Community support:
  ● Generally free
  ● FAQs, IRC, and mailing lists
● Commercial support:
  ● Companies
  ● Consultants
Maintenance
● Is the project mature?
● Is there active work on the project? (see the activity check below)
● Is there a bug/issue tracking system?
● Are issues with the software being addressed in a timely manner?
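A rough way to gauge whether work on a project is active is to count recent commits in its version-control history. The sketch below assumes the candidate project uses git and that a local clone exists at a placeholder path; the 90-day window is an arbitrary choice.

    # Rough activity check: count commits in the last 90 days of a local git clone.
    # Assumes git is installed; /path/to/candidate is a placeholder for a real clone.
    import subprocess

    repo = "/path/to/candidate"   # hypothetical local clone of the project
    result = subprocess.run(
        ["git", "-C", repo, "rev-list", "--count", "--since=90 days ago", "HEAD"],
        capture_output=True, text=True, check=True,
    )
    print(f"Commits in the last 90 days: {result.stdout.strip()}")

Commit counts are only a proxy; pairing them with mailing-list and bug-tracker activity gives a fuller picture.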
Reliability
● How does the website describe the maturity of the software?
● Are people using the software in production environments?
● Are there test suites to check that the software functions? (see the sketch below)
● Are the test suites run regularly to make sure the code base continues to function?
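As a concrete example of the kind of check a test suite provides, the sketch below uses Python's standard unittest module. The parse_config function is a hypothetical stand-in for part of a candidate package, included only so the example runs.

    # Minimal test-suite sketch using Python's standard unittest module.
    # parse_config is a hypothetical stand-in for code in the candidate package.
    import unittest

    def parse_config(text):
        # stand-in implementation: "key=value" lines become a dict
        return dict(line.split("=", 1) for line in text.splitlines() if "=" in line)

    class ParseConfigTest(unittest.TestCase):
        def test_basic_pair(self):
            self.assertEqual(parse_config("port=8080"), {"port": "8080"})

        def test_ignores_blank_lines(self):
            self.assertEqual(parse_config("\n\n"), {})

    if __name__ == "__main__":
        unittest.main()

Running such a suite regularly, for example from a continuous-integration job, is what keeps a code base demonstrably working over time.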
Performance and Scalability
● Performance:
  ● How fast (or slow) is the software?
  ● Posted benchmarks may not be applicable to your application
  ● May need to benchmark during detailed analysis (see the sketch below)
● Scalability:
  ● Are people using the software in similarly sized configurations?
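When posted benchmarks don't match your workload, a small micro-benchmark of the operation you actually care about is a reasonable starting point. The sketch below uses Python's standard timeit module; the sorted() workload is only a placeholder for a call into the candidate software.

    # Micro-benchmark sketch using Python's standard timeit module.
    # The sorted() workload is a placeholder; substitute an operation from the candidate software.
    import random
    import timeit

    data = [random.random() for _ in range(100_000)]
    seconds = timeit.timeit(lambda: sorted(data), number=20)
    print(f"20 runs took {seconds:.3f} s ({seconds / 20:.4f} s per run)")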
Usability
● How easy is it to use the software?
● How much work is it to set up a new user?
● How difficult is it to do common tasks?
● Interfaces:
  ● Library application programming interfaces (APIs)
  ● Graphical user interface (GUI)
  ● Command line interface (CLI)
● Does the software follow interface guidelines?
Security
● Static analysis of OSS code:
  ● Coverity
  ● Klocwork
● Known security issues:
  ● MITRE's CVE list
● Are developers analyzing and reviewing the code?
Flexibility/Customizability
● How easy is it to adapt the code?
● Examples:
  ● GCC's ability to be ported to new architectures
  ● Plugins for web browsers
Interoperability
● Able to work with standard formats
● Important for exchanging data with others and for using existing data
● Examples:
  ● Image manipulation systems
  ● Word processors
  ● Databases
Legal/License Issues
● Some End User License Agreements (EULAs) have clauses that may be unacceptable:
  ● Compliance audits
  ● Providing the vendor with private information
● Available source code doesn't guarantee OSS; check the license, since the copyright owner may restrict copying
● License information (OSS license compatibility):
  ● OSI
  ● FSF
● Restrictive vs. non-restrictive OSS licenses
● Legal suits
● Software patents
Evaluating Attributes
● Attribute importance varies with the application; one way to combine attributes is a weighted score, as sketched below
● Market share may be relatively unimportant for a research project
● Legal/license issues can be very important if you plan to redistribute the code
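One simple way to weigh the basic attributes against each other is a weighted score per candidate. In the sketch below, the attribute weights, candidate names, and scores are all made-up numbers purely for illustration.

    # Weighted-scoring sketch for comparing candidates; every number here is hypothetical.
    weights = {"functionality": 5, "support": 3, "license": 4, "performance": 2}

    candidates = {
        "candidate_a": {"functionality": 8, "support": 6, "license": 9, "performance": 7},
        "candidate_b": {"functionality": 9, "support": 4, "license": 5, "performance": 8},
    }

    for name, scores in candidates.items():
        total = sum(weights[attr] * scores[attr] for attr in weights)
        print(f"{name}: weighted score {total}")

The point is less the arithmetic than being explicit about which attributes matter for your application and by how much.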
Analyze Top Candidates in Detail
● The field has been narrowed to a short list of possible software packages
● Get firsthand experience with the candidate software
● Examine the code base with an eye to adding functionality
● Benchmark the code
● Check reliability: does it handle problems gracefully? (see the probe below)
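A quick reliability probe is to feed a candidate deliberately malformed input and see whether it fails cleanly instead of crashing. The sketch below drives a hypothetical command-line program named candidate-tool; both the program name and its --parse option are placeholders for whatever tool is under evaluation.

    # Graceful-failure probe: feed a hypothetical CLI deliberately malformed input
    # and check that it exits with an error code rather than crashing on a signal.
    # "candidate-tool" and its --parse option are placeholders.
    import subprocess

    malformed = b"\x00\xff not a valid input \xfe"
    result = subprocess.run(
        ["candidate-tool", "--parse", "-"], input=malformed,
        capture_output=True,
    )
    if result.returncode < 0:
        print(f"Crashed: killed by signal {-result.returncode}")
    elif result.returncode != 0:
        print(f"Rejected bad input cleanly (exit code {result.returncode})")
    else:
        print("Accepted malformed input without complaint; investigate further")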
Additional Information