A few online tools make it easy to compare criteria about software, side by side. Of course, you probably expect that I think TEC provides the mother of all evaluation tools for comparisons (true). But this post is about some of the other guys. Two sites I like, which I recently came across, might be useful to you if you’re scanning the horizon for high-level comparison info. The first is Optaros’s EOS Directory and the second is ITerating. Both approach the issue from directions that differ from, and complement, TEC’s. Here’s a bit of what’s interesting about their approaches and why I think they can offer valuable supplementary information.
The EOS Directory is, as the name implies, a directory of enterprise open source products much more than a full-on comparison tool. Nevertheless, it’s a pretty smart directory that features high-level comparison information. It lists software in a few general categories and, alongside the listings, provides four types of software/project comparison criteria.
The first criterion is the Optaros rating, geared toward open source software projects. The Optaros four-star rating system purports to show how well-suited an open source project is for the enterprise. It appears to be based on at least four considerations: the project’s functionality, maturity, community, and a progress trend, which I believe are necessary paths into research on open source solutions. Plus, as I said previously, this information nicely supplements what TEC’s open source evals cover (feature comparisons or analyses of service characteristics).
Most EOS Directory ratings are indicated through lime-colored pie symbols, which are easy to understand. For example, the more lime in the functionality pie, the more likely the project is to include what a midsize or large enterprise might need. The more lime in the community pie, the more active and numerous the community (and so, I’d believe, a safer choice in terms of future development). The trend indicator is interesting because it represents an expectation of how the project will progress in importance and improvement for each criterion.
Say you’re not sure which open source content management system to spend time researching. At a glance, the EOS Directory informs you that Alfresco provides a pretty decent and mature feature set, has an active community and a bright future, and, as an added bonus, that 23 user votes produced a user rating of three out of four stars (the linked user forum might show why people felt that way). My sole criticism of these ratings is that there doesn’t seem to be a way to figure out how they were derived for a given project; I’m not sure whether the criteria are determined by the open source project, by Optaros staff, or by some combination. Then again, I suppose asking that question means I’m looking for something that’s probably outside the scope of how the directory is intended to be used.
What about ITerating? I’m fascinated by this site because a few years ago I was imagining what a wiki-based evaluation system might be like, and I was surprised that one didn’t already exist. Something wiki-esque could provide insight from vendors, analysts, consultants, users, and developers alike. There was an open source project called WikiLens, which let its users rate things like movies or music. I played around with those scripts for a short time, curious to see what it would be like to adapt them to user-based software comparison and analysis. Now there is ITerating, which is doing quite an impressive job wikifying a software comparison methodology.
ITerating lists a fairly extensive directory of software types. You can add features and group them in higher-level categories of functionality, each of which appears to support a few different sorts of ratings. The ratings are summarized as weighted averages, displayed as bars, stars, and numeric values. ITerating adds something of a meta-rating system too, letting you vote on whether you think a review is useful or not. An important feature is that users add the functions/criteria on which to rate software, meaning a potentially wide group of people could be contributing their knowledge to the comparison.
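ITerating doesn’t publish its formula, but the mechanics of such a summary are simple enough to sketch. Here’s a minimal example, assuming (my assumption, not the site’s documented behavior) that the usefulness votes from the meta-rating feed into each review’s weight:

```python
from dataclasses import dataclass

@dataclass
class Review:
    stars: float       # the reviewer's rating, e.g. 0-5
    useful_votes: int  # "this review was useful" votes
    total_votes: int   # all usefulness votes cast on the review

def weighted_rating(reviews: list[Review]) -> float:
    """Summarize reviews as a weighted average, letting the
    meta-rating (usefulness votes) set each review's weight.
    A review nobody has voted on keeps a neutral weight of 1."""
    weighted_sum = weight_total = 0.0
    for r in reviews:
        weight = 1.0
        if r.total_votes:
            weight += r.useful_votes / r.total_votes
        weighted_sum += r.stars * weight
        weight_total += weight
    return weighted_sum / weight_total if weight_total else 0.0

reviews = [Review(4, 9, 10), Review(2, 1, 10), Review(5, 0, 0)]
print(f"summary rating: {weighted_rating(reviews):.2f}")  # -> 3.70
```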
There is a lot to like in the ITerating wiki. It opens up the comparison process to a wide and possibly unexpected audience. This could be a strength, helping visitors gain a variety of data, but it could also prove a weakness. I’d love to see ITerating provide more background about the perspectives behind each set of user ratings. It could, for example, delineate and weight user scores based on whether the reviewers are vendors, consultants, analysts, developers, clients, and so on. I think this would help mitigate bias issues that could otherwise arise.
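To make that concrete, here’s a rough sketch of the kind of role-based weighting I have in mind. The roles and weight values are entirely my own invention, not anything ITerating implements:

```python
# Hypothetical role weights -- my own invention, not ITerating's.
# The idea: discount scores from parties with an obvious stake
# (vendors) and lean a bit more on hands-on perspectives.
ROLE_WEIGHTS = {
    "vendor": 0.25,
    "consultant": 0.75,
    "analyst": 1.0,
    "developer": 1.25,
    "client": 1.5,
}

def role_weighted_score(scores: list[tuple[str, float]]) -> float:
    """Average (role, score) pairs, weighting each by the role."""
    total = sum(ROLE_WEIGHTS.get(role, 1.0) * s for role, s in scores)
    weight = sum(ROLE_WEIGHTS.get(role, 1.0) for role, _ in scores)
    return total / weight if weight else 0.0

scores = [("vendor", 5.0), ("developer", 3.5), ("client", 4.0)]
print(f"role-weighted score: {role_weighted_score(scores):.2f}")  # -> 3.88
```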
ITerating offers quick comparisons, but it sometimes doesn’t seem to have fully articulated feature sets for making the comparison. It also doesn’t seem to let you do custom analyses based on the importance of your most valued criteria. Not knowing how complete the data is for a particular type of software makes it a bit difficult to truly depend on the site for an entire selection process (though mileage may vary based on the software you’re considering). Perhaps this is a growing pain for the young site; I’ll be curious to see how it evolves. This criticism may also be unfair in the sense that ITerating is a “directory and review” site, which it accomplishes without wading deep into the complexity of individual analyses or selection guidance.
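As for custom analyses, the mechanics are straightforward: assign each criterion an importance, then score candidates against those priorities. A minimal sketch follows; the criteria names and numbers are illustrative, not real site data:

```python
# Importance-weighted scoring across criteria -- the sort of custom
# analysis I'd like ITerating to offer. All data here is made up.
importance = {"versioning": 5, "workflow": 3, "search": 2}

# Hypothetical per-product criterion scores on a 0-4 scale.
products = {
    "CMS A": {"versioning": 4, "workflow": 2, "search": 3},
    "CMS B": {"versioning": 2, "workflow": 4, "search": 4},
}

def fit_score(scores: dict[str, int]) -> float:
    """Weight each criterion score by how much you care about it."""
    total_importance = sum(importance.values())
    return sum(importance[c] * scores.get(c, 0) for c in importance) / total_importance

# Rank candidates by how well they fit your priorities.
for name, scores in sorted(products.items(), key=lambda p: -fit_score(p[1])):
    print(f"{name}: {fit_score(scores):.2f}")  # CMS A: 3.20, CMS B: 3.00
```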
Finally, both the EOS Directory and ITerating offer the possibility of a lot of great, user-based, real-world feedback. That’s an exciting extra perspective. Their broad scope may also be more ambitious than that of matrix sites (for example, CMS Matrix or WikiMatrix). I’d recommend browsing ITerating or the EOS Directory if you want to do some high-level research across an enormous sea of open source projects and other commercial software.
From the perspective of a company trying to understand its needs, identify the software vendors able to meet them, and decide on the best solution, I think using multiple user comparison and directory sites could help inform your decisions and evaluation stance. These sites make useful supplements to the comparison analyses and needs prioritization of TEC’s evaluation systems.