Lightweight models and cost-effective scalability

May 12, 2010

“There will be aggregator businesses and content or specific service businesses.” (Jason Fried, 2004). Tasty Planner is one of those ‘content’ businesses. It’s essentially a recipe finder/sharer and a meal planner. Users can join and upload or find recipes, schedule meals and generate shopping lists.
Tasty Planner has followed best practice by implementing a variety of revenue models. Firstly, it offers sponsorship opportunities – companies can pay $400 for a week in which only their advertisements appear on certain pages of the Tasty Planner site (although this seems to exclude a block of advertisements from Google that also appears on those pages). Although this brings in money for Tasty Planner, it could be argued that this is a questionable move, since the ads from sponsoring companies aren’t necessarily contextually relevant and could distract from the purpose of the site (the ads from Google, by contrast, appear to be contextually relevant).

Secondly, Tasty Planner lets its users create accounts. The first type of account is free and offers most, if not all, of the site’s functionality. The second type of account is available for a fee and lets users use the site without seeing ads. Whilst fees can be enough to turn away potential customers who expect online services for free, the Tasty Planner rate is so low (a dollar per month) that it is unlikely to repel these users. This is a clever strategy, as it ensures that either way (whether users are viewing ads or paying for freedom from ads) the site is still bringing in money. It also provides the flexibility to support different types of user – those who don’t want to pay can still use the site.

Thirdly, Tasty Planner has discreetly included a small image linking to its ‘Pledgie’ page. Pledgie appears to be a site where people can donate money to different ‘campaigns’. In the case of the Tasty Planner campaign, this does not seem to be working well – a notice states that no donations have been made yet. That said, the theory behind this is sound, since the site has little to lose by including the option and (potentially) a whole lot of money to gain.

Fourthly, Tasty Planner licenses its platform – it offers customisable versions of the site to people wanting something similar to its current form. In its own words: “Imagine you are a recipe book publisher, you can have your own version of Tasty Planner customized with your design and only your recipes. Those who buy your book would get much more than just a list of recipes, therefore giving you a competitive advantage. Imagine a collection of 50 cooking books, owners of your cooking books would have an interactive way to plan their weekly meals with the recipes contained in the book. This is a reason for shoppers to buy your book in favor of the next one on the shelf, and this is just one example” (Tasty Planner). This is a clear, clever example of providing outsourced infrastructure to people seeking pre-packaged solutions. The Tasty Planner model is successful – successful to the extent that other people are willing to pay for a version of it customised to meet their specific needs.

This organisation encourages viral marketing by offering ‘badges’ (small images linking back to the Tasty Planner website) for recipe contributors to display on their own web pages. All that’s necessary to incorporate one of these images is to copy the code provided and paste it into your own site’s code. Tasty Planner suggests using the badges for, amongst other things, blogs and MySpace accounts. Again, this is a low-risk, easily implemented opportunity for attracting more users (and, therefore, potentially more revenue).
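To illustrate, the kind of embed snippet a badge generator hands out can be sketched in a few lines. The domain paths and URL structure below are hypothetical placeholders for illustration, not Tasty Planner’s real badge code:

```python
# Hypothetical sketch of a 'badge' embed snippet generator. The image
# path and user URL structure are invented placeholders -- they are not
# Tasty Planner's actual badge code.
def make_badge_html(member_name: str) -> str:
    site = "http://www.tastyplanner.com"
    img = f"{site}/images/badge.png"  # placeholder badge image path
    return (
        f'<a href="{site}/users/{member_name}">'
        f'<img src="{img}" alt="My recipes on Tasty Planner" />'
        f'</a>'
    )

print(make_badge_html("alice"))
```

A contributor would paste the resulting HTML into their blog or MySpace page, and every view of it becomes a small advertisement linking back to the site.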

Part of the way Tasty Planner saves money is by having most, if not all, of its content contributed by its users. Outsourcing this task not only reduces cost but enables the acquisition of content that may otherwise have never been found or uploaded so quickly, creating a massive, rich collection of data that can then be served back to the users. What’s more, the diversity and sheer amount of data helps to draw in more users, thus continuing the cycle of user contribution and user attraction.

Another way Tasty Planner has reduced costs is by having very few staff – its ‘about’ page states that the site is developed and maintained by only four people, despite having almost 10,000 contributing members. This sort of site scales naturally, as more users mean the opportunity for better content.

Overall, Tasty Planner is an excellent example of a lightweight application and business model.

Leveraging the ‘Long Tail’

May 3, 2010

“You can find everything out there on the Long Tail” ~ (Chris Anderson, 2004)

“The idea is that, while a small number of products make up a large quantity of sales, there are many products in relatively low demand that don’t sell well on their own, but which together can outsell the more popular products.” (Danny Bradbury, 2005)

The two quotes above both help to explain the key concepts of the “Long Tail” as popularised by Chris Anderson back in 2004. Many businesses are awakening to the fact that with decreased production, distribution and support costs, selling niche market products is becoming not merely viable as a business model but profitable. ‘Stuff On My Cat’ demonstrates some of the characteristics and best practices of ‘Long Tail businesses’. In short, Stuff On My Cat is a site where users can upload pictures of their cats with stuff on them (toys, clothes, etc). It was started in 2005 and has since started selling related products (books of some of the photos submitted, clothes and other articles with pictures on them, etc). It sells only niche products, but with decreased costs making business viable, it has found its place targeting a little segment of the ‘long tail’.

“If you want to take business away from established competitors who ignore the Long Tail, you need to service the Long Tail, but doing so is going to cost you, and you can mitigate these costs by creating a market. In a market the service provider for one of your customers is another customer, not you. The key to getting customers to serve each other’s needs, is to create an environment where both sides of each transaction are serving their own self-interest.” (Francisco Gutierrez, 2005) SOMC is a clever example of the creation of a market served by its users, since the products SOMC sells are partially created by those users (based on the photos they submit). This means those same users are likely to be interested in buying the products, both because they feel a sense of ownership and because the products are relevant to their interests. In this case, the self-interest driving users includes the sense of importance and benevolence that arises from contributing and helping other users.

The image above has been created to apply specifically to software and services, but the principles apply to Web 2.0 applications as well. SOMC exemplifies these.

  • Democratising the tools of production – in the case of SOMC, the tools of production are cameras and an internet connection, which these days are so prolific many or most people have access to them.
  • Democratising distribution – SOMC products are sold both over the internet and in physical stores, making them available to everyone with internet access and many people without it. SOMC also employs a form of viral marketing to attract more users – it includes links below each image that help them to immediately share the image with their friends through email, Facebook, Myspace, Digg and many others.
  • Connecting consumers and producers – since most of the people who would be interested in buying SOMC products would be those who know about the site, all SOMC has to do to connect them with the supply of products is make sure visitors to the site are aware of its online store. Once it’s done that, almost the entire market has been reached.
  • Democratising monetisation – SOMC not only sells its products, it sells targeted advertising space.

SOMC has studied its users and has a clear profile of the ‘typical user’ accessing the site (young, Caucasian and female) and of how many unique visitors it gets a month (about 130,000). It has published these (and other) user statistics and states that it is willing to sell advertising space on the site, in order to attract businesses looking to target that demographic.

The site has a detailed FAQ section and a forum where users can find the answers to their problems on their own. The process for submitting content is also automated. All of these factors leverage customer self-service to decrease support costs and so help to make the site a viable business, but the site also provides contact details for users who haven’t found a solution through the channels above.

SOMC depends substantially on the ‘wisdom of the crowds’ to moderate content and verify content submission. Users can ‘flag’ inappropriate content (abuse reporting) to bring it to the attention of (and possibly deletion by) SOMC staff. In addition to this, only registered users can submit photos – this helps to protect against robots and other automated submission software. SOMC also uses the wisdom of the crowds to rate content – users can give an image a rating out of five stars.
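A minimal sketch of how this kind of crowd moderation and rating might work under the hood (the flag threshold and exact rules below are my assumptions – SOMC doesn’t publish them):

```python
# Minimal sketch of crowd moderation and five-star rating. The flag
# threshold is an assumption -- SOMC's actual rules aren't published.
FLAG_THRESHOLD = 3  # flags before an image is queued for staff review (assumed)

class Photo:
    def __init__(self):
        self.flags = 0
        self.ratings = []

    def flag(self) -> bool:
        """Record an abuse report; return True once staff review is needed."""
        self.flags += 1
        return self.flags >= FLAG_THRESHOLD

    def rate(self, stars: int) -> None:
        if not 1 <= stars <= 5:
            raise ValueError("rating must be between 1 and 5 stars")
        self.ratings.append(stars)

    def average_rating(self):
        return sum(self.ratings) / len(self.ratings) if self.ratings else None

photo = Photo()
photo.rate(5)
photo.rate(4)
print(photo.average_rating())  # 4.5
```

The point is that the users do almost all of the moderation work – staff only see an image once enough flags accumulate.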

Perpetual beta

April 25, 2010

Software development processes are experiencing a shift from occasionally releasing new versions with huge changes and lots of new features to frequently adding small changes to their products. Applications being developed in this way are described as being in a ‘perpetual beta’ stage. Nathan Wallace presents the diagram below as illustrating the perpetual beta development approach and says about it “Adopting this cycle brings certainty and momentum to end users while ensuring continuous improvement and low risk release management for developers. Developers move seamlessly from one version to the next, with only a small cross-over for bug fixing during the Beta period.” (Wallace, 2007)

Release cycle diagram

An example of perpetual beta software is the suite of programs by CloudBerry Lab. These programs demonstrate some of the characteristics and best practices of the perpetual beta approach. CloudBerry was founded in 2008 but already has several versions out. This shows frequent releases and therefore some adherence to the best practice of ‘release early and release often’ (in examining the site, I didn’t find evidence of how developed or bug-free new features were when they were released, so I can’t tell how ‘early’ these releases were).

Another characteristic of ‘perpetual beta’ software is the continual addition of features in place of ‘version’ releases and again, CloudBerry partially conforms to this. Whilst it does release in ‘versions’, the number of changes introduced with a new version is relatively low, meaning the ‘versions’ could perhaps be classified as small, continual updates and not substantially different ‘editions’. One of the benefits of changing things slowly is that it gives users a chance to adjust to the new features instead of throwing them into a completely unknown environment.

CloudBerry also mentions ‘coming soon’ features, indicating that the software is not currently complete and remains in a beta stage. This could be helpful in a few ways – firstly by creating anticipation and a sense of activity/progress that catches and holds the users’ attention and secondly by helping to lessen the shock and bafflement of the announced changes when they arrive.

“Beta testers are important because it is almost impossible for developers to test their software in all of the various conditions that might occur” (Digital River, 2009). Beta testing is an enormously important part of any software production and, interestingly, CloudBerry seems to implement it differently at different stages of development. In the earlier stages, CloudBerry invites people to sign up as beta testers. This conforms to the best practices of ‘engaging users as co-developers and real-time testers’ and ‘split testing’ (where only a portion of site visitors are shown ‘coming’ or ‘in development’ features and asked to provide relevant feedback). After that, however, the option to become a beta tester is removed (presumably when the software is first formally released).
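As a sketch of the ‘split testing’ idea, here is one common way to deterministically show an in-development feature to only a fraction of visitors. The 10% figure and the hashing scheme are illustrative, not anything CloudBerry documents:

```python
# Sketch of deterministic split-test bucketing: a fixed fraction of
# visitors is shown an in-development feature. The percentage and
# hashing scheme are illustrative assumptions.
import hashlib

def in_test_group(visitor_id: str, percent: int = 10) -> bool:
    # Hashing the visitor id keeps the assignment stable across visits.
    digest = hashlib.md5(visitor_id.encode("utf-8")).hexdigest()
    return int(digest, 16) % 100 < percent

# The same visitor always lands in the same bucket:
print(in_test_group("visitor-42") == in_test_group("visitor-42"))  # True
```

Because the bucket is derived from the visitor’s id rather than chosen at random on each visit, a tester sees a consistent version of the site and their feedback can be tied to the variant they saw.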

Once the software has been officially launched, CloudBerry lets it fall into the ‘version release’ pattern (as addressed above) with new features continuing to be added in small clusters. At that stage, a link is still provided that takes visitors to a place where comments can be left and read (meaning users can continue to give feedback).

“Provide testers with incentives to communicate and provide feedback. Beta testers that provide comprehensive feedback will often welcome, and expect, a complimentary release copy.” (Digital River, 2009) On the page where visitors can sign up to be beta testers, CloudBerry makes it clear that they will “grant a limited number of free licenses to the most active beta testers who will provide the most valuable feedback”.

Beta testing isn’t always only beneficial to the software developers. Thomas Nau stated that “doing betas and implementing new OS releases early helped a lot in system administration and overall utilization of the machines due to new tools and features”. He described his positive experience in one sentence: “Betas ensure that you stay on the leading edge of technology.” (Nau in ‘Get the best benefits from using beta software’, 2008) By making users aware of the benefits to themselves, software developers may be able to attract more beta testers and so improve the feedback they receive about the beta software.

Overall, though CloudBerry doesn’t implement all of the best practices related to perpetual beta software, it does include quite a few of them and relatively successfully so.

Software above the level of a single device

April 18, 2010

“Useful software written above the level of the single device will command high margins for a long time to come.” (David Stutz, 2003)

Although the quote above appeared in 2003, the principle it articulates remains true today. It recognises that the computing world is experiencing a shift from desktop-bound client software to net-based applications that are accessible on a range of devices from the desktop to the mobile phone as well as a host of other devices. An example of this is an application called PocketSmith – an online budgeting calendar – as it exhibits some of the best practice principles put into action.

The World Wide Web Consortium states that “For some web content or application to be device independent, it should be possible for a user to obtain a functional user experience associated with its web page identifier via any access mechanism” (W3C, 2003), and PocketSmith implements this by maintaining its users’ accounts and data in such a way that they are accessible from both desktop and mobile devices. By not binding users to a particular device, PocketSmith expands what it has to offer and appeals to a range of user types. Not every user has both a mobile device and a desktop, and those who don’t can’t make use of every access option, but for those who do, the functionality is there and waiting.

“If you ask me what computer I use, it’s the closest one. If you ask me where my data is stored, it’s in the web.” (Digital Equipment Corporation, 1997) This quote is apparently another old one, but it neatly captures the ideal of location independence. What’s more, with mobile devices being so advanced these days, the closest computer is often right in your pocket, so by offering mobile access, PocketSmith has made itself location-independent. As long as users have a compatible mobile device, they can access their data from anywhere their devices receive network coverage. Indeed, PocketSmith makes particular use of the immediacy of mobile internet access – users can check their budget from any place at any time (for example, while out shopping, a user could check whether a potential purchase would fit within their budget). In this way, PocketSmith cleverly leverages the pervasiveness of mobile computing and the internet – people seem to always have their internet-enabled devices with them and are therefore constantly online.

Another best practice that PocketSmith adopts is modifying the experience to suit the device – instead of presenting the same graphics on a mobile phone that a user would see on their desktop, the iPhone version of the application is much cleaner and simpler and only the core content is displayed.
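A simplified sketch of how such device adaptation is often done server-side (the user-agent heuristic and template names here are invented for illustration – PocketSmith’s actual approach isn’t documented on the site):

```python
# Simplified sketch of server-side device adaptation: mobile user agents
# get a stripped-down template. The detection heuristic and template
# names are invented for illustration.
MOBILE_MARKERS = ("iphone", "android", "mobile")

def template_for(user_agent: str) -> str:
    ua = user_agent.lower()
    if any(marker in ua for marker in MOBILE_MARKERS):
        return "mobile_core.html"   # core content only
    return "desktop_full.html"      # full graphics and layout

print(template_for("Mozilla/5.0 (iPhone; CPU iPhone OS 3_0 like Mac OS X)"))
```

The design choice is the same one PocketSmith makes: rather than shrinking the desktop page, the server serves an entirely different, simpler view to small devices.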

Other best practices (such as harnessing the collective intelligence) haven’t been implemented because they involve online communities and the PocketSmith application deals with its users individually. In fact, trying to create community around its users would likely backfire since the application is centred around users’ financial data, which users almost certainly don’t want publicly available. Perhaps there would be room to use that data in aggregate, non-identifying form (say, to show the community the budgeting areas that large groups of people are struggling or succeeding with or room for a forum where users can trade advice and encouragement) but currently, the application is for individual or family use.

Rich user experiences

March 28, 2010

“Fortunately, technologies exist that enable you to deliver a better-than-browser user experience without having to manually install code on client machines” (Thompson, 2007)

“We expect to see many new web applications over the next few years, both truly novel applications, and rich web reimplementations of PC applications.” (O’Reilly, 2005)

The push to introduce desktop application-style functionality into a web application, combining desktop and online benefits, is growing, exemplified by a number of recent Web 2.0 applications such as Creately. Creately offers many powerful tools to create diagrams and work on them collaboratively with colleagues through a web browser. It provides an extremely rich interface that resembles desktop software much more than it does the slow, static web pages that used to be the norm on the internet and is a perfect example of the ‘rich web reimplementations’ O’Reilly mentions.

One of the best practices for Web 2.0 apps is to design around simplicity. Creately, however, has a complicated interface filled with such a wide range of tools, complex functionality and options that a typical user may never even touch some of them. On the other hand, the interface design draws upon standard software appearance conventions (such as the placement of menus and recognisable icons) to make it easy to learn. It should also be noted that one of Creately’s selling points is that it supplies all the tools users might need, not just the basics. It’s there for users looking to create (sometimes collaboratively) complex diagram files. What Photoshop is to Paint, Creately is to primitive diagram creation programs.

Aral Balkan listed (amongst other user interface design principles for web applications) “Don’t sell what you can’t deliver”. He elaborates – “Users must not be given Graphical User Interface (GUI) expectations that cannot be met (or can only be partially met) within a Web User Interface (WUI). Whenever OS or GUI expectations are set, they must be fully met. That said, the application must try and meet OS expectations as much as possible, especially for ergonomic features such as keyboard shortcuts and navigation but also for expected auxiliary helpers such as tooltips.” On the main page of the Creately site, the main image is of its complex user interface, clearly selling a powerful, OS-style application experience. What’s more (as the promotional demonstration video shows), Creately delivers, sitting confidently at the feature-heavy end of the spectrum and meeting the expectations it instils.

Another best practice is the provision of effective search functionality and here again, Creately deviates from textbook ‘best practice’. Although it provides no prominent search feature, it does offer categorised and easy to peruse help links. Also, the site is based more upon creating data than searching for it and the main information likely to be searched for is help on how to use the tools available. The content on the site is basically entirely centred on supporting the functionality and, as such, is made up mostly of help topics and information about the company.

In this case, almost every diagramming tool imaginable has been incorporated, but instead of being a case of ‘because they could, they did, and consequently decreased usability’, this is a positive selling point: Creately sets out to satisfy its customers by offering everything a user could want, especially the features that might normally be left out of similar programs for being too obscure or unlikely to be used much.

In summary, although Creately appears to break many of the ‘best practice’ rules, it does so with good reason and demonstrates a successful embrace of the growing trend towards providing extremely rich interfaces and experiences through web browsers. It might be burdened with a massive range of features, but I see it as targeting users who want such a toolset, and it has been constructed to meet their needs very smartly.


Innovation in Assembly

March 21, 2010

A current trend on the web is a shift from closed, proprietary applications to open platforms. More and more websites are offering services along with their content, to the extent that they become similar to operating systems in providing a range of functionality. An example of this trend in action is Force.com – a SaaS (Software as a Service) cloud computing collaboration platform. Not only does it offer its users space to upload, manage and collaborate on files/data, it has applications to support this work. What’s more, these applications were not all developed by Force.com – through many measures (particularly the provision of a development environment and support), users are enabled and encouraged to build their own apps, and it is the resulting bounty of apps that Force.com uses to attract new users.

This follows the best practice of the ‘Innovation in Assembly’ pattern and benefits Force.com because, without effort on the part of its development team, the functionality it can offer is expanded, increasing its value and attractiveness to users. These user-developers can contribute innovative ideas that the employed developers might not have come up with. Also, because it’s the users creating the applications, Force.com can be assured that these are applications users actually want, so time isn’t wasted on developing features that aren’t adopted by the user base. User-developers also benefit because they gain a richer experience customised to meet their specific needs. And because Force.com is a platform, not an application, it scales easily as user-developers upload their own apps.

Another best practice of the ‘Innovation in Assembly’ pattern is the provision of APIs and Force.com adopts this, offering a range of APIs and actively encouraging as many people as possible to become developers. Both SOAP and REST protocols are supported.
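As a sketch of what calling such a REST API looks like, here is how a query URL might be constructed. The URI shape follows Salesforce’s documented REST pattern, but the instance URL and API version below are placeholders:

```python
# Sketch of building a query request against a REST API of this kind.
# The URI shape follows Salesforce's documented pattern; the instance
# URL and API version are placeholders.
from urllib.parse import urlencode

def build_query_url(instance_url: str, soql: str, version: str = "v20.0") -> str:
    # The 'q' parameter carries a SOQL query string.
    return f"{instance_url}/services/data/{version}/query?{urlencode({'q': soql})}"

url = build_query_url("https://na1.salesforce.com", "SELECT Name FROM Account")
print(url)
# A real call would send this URL with an OAuth bearer token header.
```

The SOAP interface exposes the same operations through WSDL-described messages; the REST style shown here is simply the lighter-weight of the two.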

Designing to allow components to be reused is identified as another principle of best practice and Force.com implements it by making available applications that manage small tasks that developers can integrate into their own new applications. Reuse of these component applications is encouraged through the provision of tools to search through and ‘test drive’ them, making finding and integrating an appropriate application very easy. Building on each other’s contributions, user-developers can easily and swiftly create powerful applications.

Tim O’Reilly says “we believe that Web 2.0 will provide opportunities for companies to beat the competition by getting better at harnessing and integrating services provided by others” (O’Reilly, 2005), and Force.com is an excellent example of this, as its entire business model is built upon its users extending and recombining component applications developed by other people. By supplying APIs and component applications to help in the creation of new applications, Force.com facilitates and encourages user-developer contribution, attracting an ever larger user base – which I think should see it continue to strengthen its hold on the market and prosper.


Data is the next Intel Inside

March 14, 2010

I admit to being somewhat surprised by the extent to which data has gained value, even surpassing that of software in some ways. Yet data is an increasingly precious commodity, and there are many examples of Web 2.0 applications capitalising on this. One of them is MyDataNest – a data storage company offering users a space to store their data for free or, if they want advanced features included in the deal, for a fee. MyDataNest also offers editing tools that customers can use on their stored data, but these ‘applications’ are of lesser importance compared to the data storage/access, and are generally centred around that storage/access anyway.

This type of business can be classified as a Data Infrastructure Strategy, as it provides “infrastructure for storing and accessing others’ data”. Rather than spending large amounts of money to create specialised, hard-to-obtain data of its own, or finding marketable ways of accessing such valuable data, this company simply creates space for customers to stash their data for safe-keeping – a business model seizing the opportunities at the bottom layer of the internet stack.

It is difficult to tell what level of ‘lock-in’ exists for those who register with MyDataNest, since it does not seem to publish its policy on users migrating their data to its competitors. This raises a key issue for the data-centric industry – the extent to which companies prevent their users from shifting to competitors. Chris Messina appears to hold up Google as an example of a company that seeks to monopolise the market by drawing users into its own proprietary platform and then shutting out third-party services. Tim O’Reilly suggests that this is not, in fact, best practice, and that users should instead have control over their data and be able to move it wherever they want. A term for this is ‘open data’, and Tim Bray says that “any online service can call itself “Open” if it makes, and lives up to, this commitment: Any data that you give us, we’ll let you take away again, without withholding anything, or encoding it in a proprietary format, or claiming any intellectual-property rights whatsoever” (link). The data-centric industry is relatively new, so the terms to describe it aren’t fully set yet, but I think Bray’s definition is quite perceptive and usable.
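Honouring that commitment is straightforward in practice: hand users their data back in a plain, non-proprietary format. A minimal sketch (the data below is invented for illustration):

```python
# Sketch of the 'open data' commitment in practice: user data exported
# in a plain, non-proprietary format (JSON). The data is invented.
import json

user_data = {"name": "alice", "files": ["budget.csv", "notes.txt"]}
export = json.dumps(user_data, indent=2)   # nothing withheld, no proprietary format
print(export)
restored = json.loads(export)              # any other service can read it back
```

An export like this is exactly what lets users migrate to a competitor – which is why a service’s willingness to offer it says a lot about its attitude to lock-in.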

Another point to note about MyDataNest is that while it specifies what sort of data is collected from its users and how it is used, it does not say who owns that data. This touches upon another key issue relevant to data-centric businesses – the ownership of data. In examining this issue, it should be noted that there are at least two types of data in question – the data users upload themselves (be it files or personal registration data such as names and addresses) and the aggregate data that companies can glean from users’ provided data and behaviour. Sometimes the question of ownership is made easy by means of a signup contract or similar that explicitly states who owns what but not all situations are governed by such contracts and it’s then that things get complicated.

To be honest, I don’t know the answer to the question of ownership – I can see legitimate positions on both sides. One might argue that since the data is derived from users, users have a right to at least access it, if not have some say in what is done with it. On the other hand, the companies who retrieved, combined and analysed the data could claim that they are just like other researchers and that the results of their research are theirs. Tim Bray addresses this issue as well, saying “a service could also say: We acknowledge your interest in any value-added information we distill from what you give us, and will share it back with you to the extent we can do so while preserving the privacy of others.” (link) This seems to me an idealistic rather than legalistic response, and depending on your opinion of the entire open data concept, it may seem more or less desirable and appropriate.

Harnessing collective intelligence

March 7, 2010

As I understand it, Web 2.0 is all about opening up content creation to everyone and this has brought with it enormous benefits and problems.

For me, the most prickly issue is probably the balance between allowing participation while retaining control. As an individual, I know that I like to contribute to conversation and interaction centred on topics of interest to me and that my participation creates in me a sense of ownership and affection for the result of that conversation/interaction – letting people become a part of a ‘project’ helps gather a support base for that project that can continue to support and expand it. This collaboration offers the opportunity to ‘harness the collective intelligence’ of these contributors but at the same time leaves the project at the mercy of contention arising from differences of opinion and personalities.

Andrew McAfee identifies one obvious example of a successful self-moderating community – Wikipedia. When I stopped to consider it, I agreed with his initial reaction of disbelief that a storehouse of knowledge could exist in an accurate, unbiased form when it was open for everyone to edit. But on examining Wikipedia further, it appears that such an encyclopaedia has indeed been created and maintained as such. According to McAfee, this is because the community knows that any dodgy entry will be removed or changed, so people generally don’t even bother to make them.

Another issue with harnessing collective intelligence is getting a population to start contributing in the first place. This has been a particular problem for corporate managers and leaders seeking to latch onto the benefits of Web 2.0 technologies by incorporating them into workplaces (for example, trying to get employees to start contributing to blogs or wikis). McAfee outlines a few points needed for a successful introduction of Web 2.0 technologies, but also comments that, even once the new practices have been accepted by a work community, trouble remains for the managers – maintaining direction. Since these technologies are driven by a community, managers have a difficult line to walk between directing contributions and stifling them. (Some managers permitted off-topic contributions on the grounds that it would encourage further use of the system.) Giving users permission to help gives them permission to change the project.

In addition to encouraging contributions with the lure of participation, Web 2.0 technologies can gather contributions implicitly – that is, by setting up the system so that by simply using the technology for their own goals, users at the same time add value to the system. An example of that (as identified by Dan Bricklin) is Napster – when users download songs and leave them in the shared folder, they make those songs more available to other users. This increasing range of music attracts more users, some of whom make new songs available, thus attracting more users and perpetuating the growth of the system with minimal to no ‘active’ effort on the part of the users.

Once these technologies are adopted by a large number of users, the differences amongst the users yield another benefit – diversity in categorisation. An example of this is the multiple ways of naming the same file, making it more likely that a search will retrieve the correct file under at least one of its names.
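This benefit is easy to see in a toy tag index – a file labelled under several names is findable by any of them (the file name and labels below are invented):

```python
# Toy tag index showing why diverse naming aids retrieval: a file
# labelled under several names can be found by any of them.
from collections import defaultdict

index = defaultdict(set)

def tag(filename: str, *labels: str) -> None:
    for label in labels:
        index[label.lower()].add(filename)

def search(term: str) -> set:
    return index.get(term.lower(), set())

# Different users describe the same photo differently:
tag("IMG_1041.jpg", "cat", "kitten", "tabby")
print(search("kitten"))  # {'IMG_1041.jpg'}
```

No single user has to pick the ‘right’ name; the crowd’s varied vocabulary does the categorisation collectively.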

When I reflected on harnessing the collective intelligence in the ways listed above, I became aware that the more freedom and permissions the users were given, the more they added value to the technologies they were using but, at the same time, that I was highly reluctant to share authorship of projects that are personally important to me. I think this highlights a critical dilemma in the development of Web 2.0 technologies – how to give users freedom and still retain a degree of control.

Hello world!

March 1, 2010

Welcome to WordPress.com. This is your first post. Edit or delete it and start blogging!