Empiricast post mortem

Published 2020-04-16.

In September 2018, Martin Holten and I founded Empiricast. As I’m writing this, it’s April 2020, and we have stopped working on it. I’d like to share what we did and how it turned out, so that both we and others can learn from it.

How it started

Out of personal interest, we had been studying the science of forecasting and decision-making. As we learned more, we started noticing a disconnect between what the science tells us and what organizations were actually doing. In short, the cutting edge was (and still is) far ahead of most companies, government agencies and non-profits.

During the summer of 2018, we started playing with different ideas for how we could improve decision-making. In the fall, we decided to start a company to build software for organizations. The goal was to make it easier for them to make better decisions by improving their forecasting ability. For more information on our approach, check out our white paper.

Phases

We went through five major phases:

  1. Sales/market research
  2. Version 1 development
  3. Sales
  4. Version 2 development
  5. Promotion and market research (the financial industry)

First round of market research

We wanted some validation from potential clients before we started building anything. We found it surprisingly easy to get meetings, and a handful of them were very positive. Ideally, we wanted to close at least one sale before developing anything, but we weren’t able to. What we were offering was fairly new to people, so we figured we needed to show them what we meant.

With the benefit of hindsight, I now think we should have:

  1. Pushed harder to get to a clear yes or no. We didn’t realize it at the time, but positivity does not equal willingness to pay. If we had pushed harder, we would probably have learned more from these interactions.
  2. Sought out more resources and guidance on enterprise sales. It really is a craft, and it requires a lot of patience.

Version 1 development

Confident that the market was there, we set out to develop our first version. It went fairly well, but if we were to do it again, we would save time by picking a different set of tools. We went with a Python REST API on the backend and React on the frontend. That was fun and it worked well, but we spent a lot of time building things that come out of the box in systems like Bullet Train. Having a separate backend and frontend also took extra time and didn’t make any difference in the end.

Sales

In February 2019, our prototype was ready for real users. We reached out to the companies we had spoken to earlier as well as many others. Over the course of about six months, we tried selling to a variety of industries, but we still weren’t able to close any deals. During this phase, it really dawned on us how difficult sales can be.

Version 2 development

After the summer, we were pretty demotivated, but we still weren’t ready to give up. So, while we kept working on our existing leads, we built a second version that was easier to use and open to the public. We also started consulting to bring in some income.

Promotion and market research (the financial industry)

In January 2020, the second version was ready, and we started promoting it to the public. We got a fair number of visitors who stayed on the site, but we found it hard to convince users to create content (forecasts and comments) themselves.

We also wanted to test sales in the financial industry, since our new version was geared toward time-series forecasting. So we spoke to a lot of people in the industry in Norway. The results weren’t encouraging, so we dropped it.

Lessons learned

We can never tell for sure what would have happened if we had done anything differently. That said, below are some hard-earned lessons we are fairly confident in. I’ll preface them by saying that many are fairly well known, but reading about mistakes and making them yourself are two very different things.

  1. It’s easier to sell painkillers than vitamins. We were trying to sell vitamins — something that would help you improve over time, but at some cost now.
  2. Sales are hard and take a lot of time, especially calendar time.
  3. Sales (or some other kind of traction) is the most important thing. Your product can be great, but if people don’t want it, you’re going nowhere.
  4. Build a complementary team. Martin and I work very well together, but we’re both fairly technically minded, and we don’t enjoy sales as much as some people.
  5. Get a really good understanding of your customer and their willingness to pay before building anything.
  6. Be wary of making something that is mainly valuable to large companies (like we did). They take forever to make decisions, and you need to spend a lot of effort on the sales process.
  7. Just because a company has a problem doesn’t mean individuals in the company are sufficiently motivated to have it solved. If you’re an individual in a large organization, you usually have very little influence over the overall performance of the organization, and it doesn’t matter that much to you anyway. But you may have a lot of influence over your workday and what your boss and colleagues think of you. In other words, don’t make what people should want. Make what they actually want.
  8. Decision-making and forecasting are hard. Figuring out what forecasts to make, making those forecasts, and then incorporating them into your decisions is far from trivial.
  9. Many people and organizations have “IT fatigue.” They have many failed projects behind them, and are wary of starting new ones.
  10. Large organizations often want to be innovative, so they are welcoming when startups reach out to them. But actually closing deals is a different beast to slay.

Could we have made it with a different product?

While our prototypes worked the way they were designed to, they still left a lot to be desired. We modeled forecasting questions, but we didn’t really model decisions, which is what people really care about. (See the philosophy of Ought’s Ergo project for more on this.) An alternative approach that would be interesting to try could look something like this:

  1. Start with an existing decision-making process — e.g. risk management in a large enterprise.
  2. Find a measurement of decision-making quality.
  3. Study how they are making specific decisions.
  4. Test improvements to their decision-making process. (Improved forecasting could be one such improvement.)
  5. Generalize to other companies and possibly domains.

Other things

Note that I have left out a lot of things, like technical discussions, administrative stuff, attempts to get funding, etc. But they’re not really important for the purposes of this post mortem.

I have started sharing some code related to forecasting — check out Python Prediction Scorer if you’re interested.
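If you’re not familiar with scoring rules, here is a rough sketch of the Brier score, one common way to score probabilistic forecasts. It’s plain Python written purely as an illustration, not the library’s actual API:

    # Brier score for binary forecasts: the mean squared difference between
    # the forecast probability and the outcome (0 or 1). Lower is better.
    def brier_score(forecasts, outcomes):
        return sum((p - o) ** 2 for p, o in zip(forecasts, outcomes)) / len(forecasts)

    # Example: a forecaster who said 80% and 30% on two questions
    # that resolved yes (1) and no (0) respectively.
    print(brier_score([0.8, 0.3], [1, 0]))  # roughly 0.065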

Conclusion

Thanks for reading — I hope you found it useful. While we’re no longer working on Empiricast, we’re still very interested in the domain. So feel free to reach out for a chat.