StartupToDo.com Scholarships for St. Louis Startups

A few weeks ago I met Bob Walsh, the well-known MicroISV guru (he wrote the book on it). He has a startup-acceleration company called StartupToDo.com, which he persuaded me to take a look at. The site offers a pile of information, for a fee, to help startups “cut to the chase” as they get moving. I was initially skeptical, because there is such a vast amount of that kind of information available for free online.

Then I thought about it a while, and thought about various potential and actual founders I’ve met, and thought about how much time a person can spend browsing around for information, and looked through the guides on StartupToDo… and it now appears to be a worthwhile resource for first-time founders to use (and pay for). Speed is everything, and StartupToDo could save a founder some hours.

At the same time, I’ve been looking for ways to help boost the nascent St. Louis startup and software-company community.

Putting those two things together, Oasis Digital (my firm) is going to sponsor (that is, pay for) a StartupToDo membership, for up to 10 St. Louis area startup companies. The rules are simple:

  • Must be a software product / service or related startup company (not consultant)
  • Must be located within 50 miles of St. Louis, MO
  • Either starting in the near future, or started within the last year
  • The first 10 that meet these requirements “win”
  • Act fast – you must “apply” by the end of December 2009.

Read the official details on Bob’s site, and follow his directions to “apply”.

While you’re at it, consider getting involved with ITEN-STL if you aren’t already. ITEN offers assistance of various kinds to St. Louis area startup firms, and I am an ITEN mentor.

Massive Parallelism and Microslices

I just read James Hamilton’s comments on “Microslice” servers: very low-power machines with a high ratio of CPU capacity to wattage. As he explains in detail, at scale the economics of this design are compelling. In some ways, of course, this is the opposite of another big trend going on, consolidation through virtualization. I reconcile these forces like so:

  1. For enterprises with a high ratio of employees per server CPU, costs tend to scale with the number of boxes / racks / etc. This makes virtualization onto a few big servers a win.
  2. But for enterprises with a low ratio (lots of computing work, small team), the pure economics of the microserver approach makes it the winner.

The microserver approach demands:

  • better automated system administration – you must get to essentially zero marginal sysadmin hours per box
  • better decomposition of the computing work into parallelizable chunks
  • very low software cost per server (you’re going to run a lot of them), favoring zero-incremental-cost operating systems (Linux)

My advice to companies who make software to harness a cloud of tiny machines: find a way to price it so your customer pays you a similar amount to divide their work among 1000 microservers as they would among 250 heavier servers; otherwise, when they move to microservers, they may find a reason to leave you behind.
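
To make that concrete, here is a minimal sketch in Python of one way to do it – charge per unit of aggregate compute capacity rather than per box. All the prices and capacity numbers below are hypothetical, purely for illustration; the point is only that the bill stays roughly flat whether the customer deploys a few heavy servers or many microservers of similar total capacity.

```python
# Sketch of capacity-based licensing (hypothetical numbers).
# The bill is driven by aggregate capacity, not by how many boxes deliver it.

PRICE_PER_CAPACITY_UNIT = 40.0  # hypothetical annual price per capacity unit

def license_cost(node_count, capacity_per_node):
    """Total annual license cost, proportional to aggregate capacity."""
    return node_count * capacity_per_node * PRICE_PER_CAPACITY_UNIT

if __name__ == "__main__":
    heavy = license_cost(node_count=250, capacity_per_node=4.0)   # 1000 capacity units
    micro = license_cost(node_count=1000, capacity_per_node=1.0)  # 1000 capacity units
    print(f"250 heavier servers: ${heavy:,.0f} per year")
    print(f"1000 microservers:   ${micro:,.0f} per year")  # same total, 4x the boxes
```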

On a personal note, I find this broadening trend toward parallelization to be a very good thing – because my firm offers services to help companies through these challenges!

Is your work getting better?

We’ve all heard that life / business / progress are moving faster “these days” than ever before. This feels true to me (in the positive sense, I am no Luddite), but I am also leery of how easily each generation becomes convinced that it invented newness, change, and youth.

On the topic of technical design innovation, though, we are obviously living in an era of very rapid progress. Here is a great example:

[Image: nano, 2005 vs. 2008]

The question this raises for me, and that it should raise for you if you are in a field which is at all technical and competitive, is whether you are keeping up with the pace of the world around you. Compared to three years ago, is your work product (code, process, design, attention to detail, vigor) obviously better? How about the next three years?

Business of Software 2009: Excellent

I just returned from the Business of Software 2009 conference, and can summarize it as excellent. Here are some thoughts on specific bits of it, mostly interesting to people who were there.

  • Geoffrey Moore’s opening talk was an early highlight of the conference; I’ve often been disappointed when a well-known person from somewhat outside a conference’s focus is invited to talk, but it turned out that Geoff had ample highly relevant content. Most notably, his 9-point recommendation for small software firms is dead on.
  • It is highly likely that my next project will be in one of the 20-something categories that Paul Graham thinks will grow. I’m not sure if this is saying much, though, because his points were so numerous and broad.
  • Mat Clayton had strong points about A/B testing, but I felt a bit dirty merely being in the room for his list of “dirty tactics” for social networking promotion. I heard similar feedback from other attendees.
  • Don Norman’s talk was excellent, but would have been even better if it had been a bit shorter and thus tighter.
  • My favorite talk of the conference was Ryan Carson’s. In conversations about his talk, I heard from several directions the idea that the essence of Ryan’s message was a trade-off: giving up profits in order to do various good things instead. I strongly suspect, though, that Ryan is doing the best he can, i.e. the strategy he proclaims is also how he maximizes profits (for a company like his).
  • Paul Kenny talked about telling stories. You must do this. I can’t explain just how important this talk was, so I won’t.
  • Pecha Kucha was this conference’s name for lightning talks. As elsewhere, these talks are usually very dense and very good, because the format forces the speaker to discard all the slow parts, all the boring parts, all the exposition, and instead go directly for their key points. It works.
  • I noticed a large number of people using TweetDeck, and adopted it myself. It is a higher-mental-bandwidth way to consume Twitter and Facebook data streams, and is well suited to a sane Twitter usage pattern of one short, intense session per day.

I have only a few criticisms:

  • A few of the speakers went long. Though it would annoy the speaker, it would be much better for the conference if all sessions were promptly stopped on time.
  • Luke Hohmann’s talk on “Innovation Games” felt like a sales pitch for his company, even though he tried hard to talk mostly in general terms.
  • The schedule was a bit too dense. We needed more slack between / before / after, to discuss and absorb the information.
  • It would have been nice to have a talk address the business of custom software development.
  • The swag, in the form of a slanket / snuggie, is much too physically large for an event attended mostly via air travel. Of course I could have discarded it (and some attendees did), but that would have felt like waste. I would have preferred if Neil had simply scrapped it and kept that money as profit.

SaaS is good; Software as a Product is also good!

A reader recently wrote to ask for advice about a product which started as a SaaS offering, but for which the technology and customer needs are pointing in another direction:

[…] As we evolved this product we started working more with directory services and really needed to move the app inside the corporate network for optimum functionality. We are just undergoing our first pilot of this product launched in this manner […] I really want to stay with a SaaS model but this is clearly a product. It installs on their gear, they maintain the systems etc. Have you heard of any precedent that would allow a product like this to be sold as a service?  […]  $x/desktop/year kind of thing.  However, most software comes with a perpetual right to use license and I’m not sure if it’s fair to me or my customer to sell them a product they install but that is licensed on more of a subscription basis. […]

First and most importantly, this reader is in a very good position: a pilot project with (I assume) a real customer who wants to pay for and use the software at hand. In this situation, the obvious advice is to figure out how your customers want to pay you, and let them pay you. That’s not very informative, though.

To get a handle on the situation here, consider the appeal of SaaS. In my experience as both a customer and a seller of software-service offerings, the key benefits of the SaaS model are:

  • Recurring revenue for the vendor, which facilitates a stable and growing business that makes payroll every time.
  • Reduced risk for the customer, who can stop using the service when the need arises (either because it does not work well, or because they no longer need it) with much less abandoned investment compared to an up-front on-premise installation
  • Better alignment of the vendor and customer interests

Based on the need for your software to operate behind the firewall, it appears that the bulk of potential customers at this time (2009) will indeed prefer to install the software in house. It is technically feasible to implement SaaS in such a way that it authenticates (or federates) with existing internal software, but thus far that is not a popular way to go, particularly from a security point of view.

The question then, is how to get at least a portion of the benefits of SaaS, with on-premise installed software. This turns out to be fairly easy:

Sell the software as a “subscription”. It is fairly common to sell software with subscription pricing, with a perpetual right-to-use bundled in. Offer your product in a bundle something like this:
  • The software itself, including perpetual right to use it (but with the caveats below)
  • Installation assistance
  • Support
  • Upgrades
  • Bug fixes
  • Priced at $X per user per year

If you set the pricing so that year 1 is the same price as years 2..N, you have something very close to SaaS in terms of your customer relationship (though not in terms of the deployment architecture, of course). If you set the pricing so that years 2..N cost 10-20% of year 1, you have the traditional purchase-then-maintenance enterprise software pricing model.
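
To illustrate the arithmetic, here is a minimal sketch in Python comparing the two shapes. The prices, customer size, and 15% maintenance rate (within the 10-20% range mentioned above) are all hypothetical, chosen only to show how the cumulative cost works out.

```python
# Sketch comparing the two pricing shapes (hypothetical numbers):
# a flat per-user-per-year subscription, versus a traditional up-front
# license plus annual maintenance at a fraction of the first-year price.

def subscription_total(price_per_user_year, users, years):
    """Flat subscription: every year costs the same."""
    return price_per_user_year * users * years

def perpetual_total(year1_price_per_user, users, years, maintenance_rate=0.15):
    """Traditional model: full price in year 1, then a fraction of it each later year."""
    year1 = year1_price_per_user * users
    maintenance = year1 * maintenance_rate * (years - 1)
    return year1 + maintenance

if __name__ == "__main__":
    users, years = 100, 5
    print("Subscription at $120/user/year:   ", subscription_total(120, users, years))
    print("Perpetual at $400/user + 15%/year:", perpetual_total(400, users, years))
```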

What if the customer pays for year 1, then stops paying but keeps using the software?

This is a possibility, but it may not be a problem in practice. Enterprise products tend to require support and upgrades to keep working smoothly in their real-world environments (ongoing streams of business and technical changes), and enterprise customers who rely on such a product generally value an ongoing relationship with the vendor. Some customers will use-but-not-keep-paying, but even then you can think of it as price discrimination: customers who need more value (ongoing support and updates) pay more to get it, while customers unwilling to pay for the ongoing subscription have a way to buy a lesser offering (just the first year, then they are on their own) at a lower cost.