Author: Kyrie Canille

Working in Southeast Asia is amazing for so many reasons, and one of them is getting to know innovators and entrepreneurs right where they are. We weren’t satisfied with how little information is available about these companies and entrepreneurs, because we know there is so much to learn from Southeast Asian startups and their founders.

To start this quest to consolidate that knowledge, we thought it would be best to get the information straight from the horse’s mouth. We spoke to David Fallarme, Inbound Marketing Director at ReferralCandy, and he shared some great insights with us. Check it out!

Initial Stage

How did you get your first 1,000 users?

David: Since the very beginning, we had a strong focus on publishing regularly on our blog. Lots of quality posts on an active blog led to a lot of search traffic, which led to people signing up for ReferralCandy. As for specific tactics: we liberally used Brian Dean’s skyscraper technique for creating content. It’s a great framework for thinking about creating blog posts that get traffic.

What were the best performing acquisition channels?

David: Organic search has always been an important channel for us, hence our focus on producing content in our vertical and continually making SEO tweaks. Platform partnerships were also important in jumpstarting ReferralCandy’s growth. In our case, we got our app on the Shopify app store just as it was really taking off. When you see an opportunity to piggyback off a growing platform — jump on it, especially in the early days.

Growth Hacking Asia Startup Interview with ReferralCandy

Growth Stage

What were the biggest challenges you faced while scaling the company?

David: Finding new, sustainable channels is an interesting challenge. We’re past product-market fit and trying to profitably find as many people in our audience as possible. So most of our challenges center around either tweaking existing parts of our acquisition machine to make them more efficient, or running experiments to unlock new, unexplored channels.

Did your focus on user acquisition channels change over time or were your initial channels scalable enough?

David: Our acquisition strategy hasn’t changed so much as matured. For example, organic search was a key channel in the early stages, and it continues to be very important today. But we’re looking to diversify; it’s always scary when Google announces any changes or even just sneezes in your general direction. So our acquisition channel strategy still has a heavy focus on search, but we’re also aware that we need to continually diversify.

(…) it’s always scary when Google announces any changes.

What was the split between paid and organically acquired users? How did this change over time?

David: We started with organic, with some marginal amount being spent on paid just to have a presence. But we’re doing more and more paid acquisition now. We have a much better handle on all of our funnels, and now that we’re bigger, we’ve got more budget to spend, which means we can run more and more experiments and learn faster.

What were the success metrics you focused on during your growth stage? Did they change over time?

David: We started with a focus on reliably generating website traffic and signups. We just wanted to get people in the door and learn along the way. As our growth ramps up, our spend also increases — so as we try to scale up we’re keeping a much closer eye on unit economics. Our metrics now are more centered on CAC, LTV, payback period and ROI per channel.
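To make those terms concrete, here is a minimal sketch of how these unit-economics metrics relate to each other. All figures below are hypothetical, not ReferralCandy’s:

```python
# Hypothetical unit-economics sketch; none of these figures are ReferralCandy's.
ad_spend = 10_000.0          # total spend on a channel for the month
new_customers = 125          # customers acquired from that spend
monthly_margin = 40.0        # gross margin per customer per month
avg_lifetime_months = 24     # how long a customer stays, on average

cac = ad_spend / new_customers                 # customer acquisition cost
ltv = monthly_margin * avg_lifetime_months     # lifetime value
payback_months = cac / monthly_margin          # months to recoup CAC
roi = (ltv - cac) / cac                        # return per dollar of spend

print(f"CAC: ${cac:.2f}, LTV: ${ltv:.2f}, "
      f"payback: {payback_months:.1f} months, ROI: {roi:.0%}")
```

The point of tracking these per channel is that a channel can look cheap on CAC alone but still lose money once lifetime value and payback period are factored in.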

What is the biggest mistake you made and what did you learn from it?

David: Not prioritizing campaigns & data reporting. It’s hard to run growth experiments when you don’t have the measurement frameworks in place. Most people skip this step. For example, they install the plain Google Analytics code and they think it’s done. I would love to be able to go back in time and tell myself to be meticulous in setting up events, setting up funnels and goals, and ensuring that I’m regularly using UTM values for as many campaigns as possible.
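As a hedged illustration of the UTM habit David describes, a small helper function (the names and URLs below are made up) can keep campaign tagging consistent:

```python
# Sketch of consistently tagging campaign URLs with UTM parameters so
# Google Analytics can attribute traffic. Campaign names are illustrative.
from urllib.parse import urlencode

def utm_url(base_url, source, medium, campaign, content=None):
    """Append standard UTM parameters to a landing-page URL."""
    params = {
        "utm_source": source,
        "utm_medium": medium,
        "utm_campaign": campaign,
    }
    if content:
        params["utm_content"] = content
    return f"{base_url}?{urlencode(params)}"

link = utm_url("https://example.com/pricing",
               source="newsletter", medium="email", campaign="spring_promo")
print(link)
# https://example.com/pricing?utm_source=newsletter&utm_medium=email&utm_campaign=spring_promo
```

Generating links through one helper, instead of hand-typing parameters, avoids the inconsistent casing and misspelled values that fragment campaign reports.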

Thanks, David, for sharing ReferralCandy’s story!


If you’re a startup founder based in SEA looking for help with growing your business, let’s have a chat about growing your startup through an experiment- and data-driven approach.

Contact us for more details!


“If you can’t measure it, you can’t improve it” – Peter Drucker

All too often, in an effort to be ‘data driven’, teams fall into one or more of the following traps:

  • Tracking the wrong things, or not tracking things in enough detail, leading to data that is not providing actionable insight
  • Tracking everything possible, without a plan for how to leverage it, on the assumption that it might be useful at some point. This can lead to code bloat, overuse of the network connection in the app, and potentially wasted dev cycles implementing or updating redundant tracking
  • Failing to effectively structure and process the data that is collected, leading to a mounting heap of unused or surplus analytics data that fails to add value
  • Analysis paralysis: spending tons of time slicing and dicing data, but failing to act on it (often exacerbated by unstructured or surplus data)
  • Failing to keep analytics up to date with new product updates, leading to data which does not reflect product usage accurately

Source: Minimum Viable Analytics – The Mobile Growth Stack

Email marketers, I’ve got some good news and bad news for you. Let’s start with the encouraging news: according to new data from Yes Lifecycle Marketing, the average email open rate (currently 16.1%) has registered a 9% year-over-year increase and a 3% increase over the last two years. So congrats, your subscribers want to open your marketing messages!

Triggered emails are a highly effective way to build a relationship with subscribers and boost engagement.


Find the full article here: Boost Subscriber Engagement with Triggered Emails | Yes Lifecycle Marketing

Use Google’s Experiments to enhance your website’s engagement and conversion metrics


Last week, we shared a few tips for A/B testing, but how should you start one?

You’ve applied all the web design best practices, but there’s no visible performance improvement. You’re not sure what the problem is, and you need a way to run more continuous tests to maximise your website’s engagement and conversion metrics.

In this article, we’re going to explain how to set up a split test inside Google Analytics in a few minutes.

The goal is always the same: help visitors get what they want as fast as possible while moving them closer to one of your conversion goals.

Google Analytics comes with a basic experiments feature that allows you to compare different variations of a page and split traffic between them accordingly. Keep in mind, though, that you’ll need quite a bit of volume to get statistically significant results (a rule of thumb is a minimum of 100 conversions per variation, but this can vary quite a bit) and to be able to decide whether the changes actually worked.
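To get a feel for how much volume “quite a bit” means, here is a rough back-of-the-envelope estimate using a common shortcut formula (roughly 80% power at a 5% two-sided significance level). Treat it as a sketch, not a substitute for a proper sample-size calculator:

```python
# Rough sample-size estimate per variation for an A/B test, using the
# common shortcut n ~= 16 * p * (1 - p) / delta^2
# (approx. 80% power at a 5% two-sided significance level).
def sample_size_per_variation(baseline_rate, min_detectable_lift):
    """baseline_rate: current conversion rate, e.g. 0.05 for 5%.
    min_detectable_lift: absolute change to detect, e.g. 0.01 for one point."""
    p = baseline_rate
    return int(16 * p * (1 - p) / min_detectable_lift ** 2)

# Detecting a lift from 5% to 6% needs thousands of visitors per variation:
print(sample_size_per_variation(0.05, 0.01))  # 7600
```

Notice how quickly the requirement grows as the lift you want to detect shrinks, which is why low-traffic sites struggle to run meaningful experiments.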

If your traffic numbers are still low, the most common way to generate cheap traffic fast is through Facebook ads. In our next article, we’ll share a deep dive on how to set them up and run them in the most effective way.

But How Can You Run an A/B Test in Google Analytics?

Setting it up takes only a few minutes. Once you’ve decided what you need to test, you can get started.

A.) Get Started

Under the BEHAVIOR tab, you can find the Experiments tab. If this is your first experiment, you’ll probably see something like this:

Setting up A/B Test in Google Analytics
Click on Create Experiment at the top left.

B.) What should you experiment with?

Name your experiment after your objective. Here you can set a measurable outcome to check results and see which variation performs best.

AB Test - Create New Experiment
Here you might:

  • Select a Site Usage metric (like bounce rate, shown here)
  • Select an existing goal (like purchases, opt-ins, etc.)
  • Create a new objective or goal

This depends on what you’re testing in the first place. You’ll be surprised by all the metrics related to your website; you can see the bounce rate, for example, under Behavior > All Pages. In our test, we’ll use Bounce Rate as the objective.

Google Analytics behaviour reports

By default, all those advanced options are off and Google will always “adjust traffic dynamically based on variation performance”.

Let’s keep going.

C.) Configure your experiment

Now add the URLs of all the page variations you want to test. Just copy and paste the links, like this:

The New Google Analytics Content Experiments

You can give each variation a name so you can remember it easily; I’ve named mine as shown above.

D.) Script code

Editing the page’s code can look scary at first, but it’s actually quite easy. The first thing you see under this section is a helpful toggle button to email the code snippet to your developer.

Setting up your experiment code
AB Tests step by step in Google Analytic, adding script code to your page

If you’re doing it yourself, make sure to double check the pages you’re testing to make sure that your default Google Analytics tracking code is installed in all of them.

Next, highlight and copy the code provided. Then look for the opening head tag in the ORIGINAL variation (located near the top of your HTML document).

Search for <head> to make it easier for you.

Once that’s done, click NEXT STEP in Google Analytics and they’ll verify if everything is ready to go. If it’s not, they’ll let you know.

Inspect element in Google Chrome
Experiment code validation

And… voilà!

Don’t forget that you can only make an important decision once your experiment has reached at least 95% statistical significance. You can measure it using this very helpful tool.
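If you’d rather check significance yourself than rely on an online tool, the standard two-proportion z-test can be sketched in a few lines. This is a simplification (dedicated calculators also handle power and multiple comparisons), and the function name is our own:

```python
# Sketch of checking whether an A/B result clears 95% significance,
# using a two-proportion z-test (standard formula, stdlib only).
from math import sqrt, erf

def ab_significant(conv_a, n_a, conv_b, n_b, alpha=0.05):
    """conv_*: conversions per variation; n_*: visitors per variation."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)      # pooled conversion rate
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the normal CDF, via the error function.
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return p_value < alpha, p_value

# 5% vs 7% conversion on 2,000 visitors each clears the 95% bar:
significant, p = ab_significant(conv_a=100, n_a=2000, conv_b=140, n_b=2000)
print(significant, round(p, 4))
```

The same inputs with a smaller gap (say 100 vs 105 conversions) would come back not significant, which is exactly the trap the 95% rule protects you from.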

What we can learn from this is that websites are never 100% finished! We always need to experiment with new ideas and analyse the data to keep improving our conversion metrics.


In the introduction to this series I made the point that Product Market Fit isn’t the only thing that matters. It is actually only one of four fits needed to grow a product to $100M+ in a venture-backed time frame.

While Product Market Fit isn’t the only thing that matters, it is important, so it makes sense that there are no shortage of blog posts explaining Product Market Fit, and how to get it.

Instead of echoing the many great Product Market Fit explainer posts out there, I’m going to focus on the 5 elements of Product Market Fit that I believe are most misunderstood and overlooked:

  • The wrong way to search for Product Market Fit.
  • Why we should be thinking about it as Market Product Fit.
  • How we defined our market and product hypotheses for early versions of HubSpot Sales.
  • What the search for market product fit looks like in reality, not just in theory.
  • Qualitative, Quantitative, and Intuitive signals of market product fit.

Source: The Road to a $100M Company Doesn’t Start with Product — Brian Balfour’s Coelevate

How incremental design changes can improve the seller experience to drive listing growth & engagement

When moving apartments in Singapore I tried Carousell, a Southeast Asian peer-to-peer marketplace similar to Letgo (US) & Mercari (Japan). I was instantly impressed with the ability to monetize my unwanted belongings with speed and simplicity. Every sale felt like I was getting free money since the other alternatives were 1) letting items collect dust at home or 2) donating / throwing them away. From then on, I used Carousell extensively to buy used items for my new flat and later to sell most of my belongings when I left Singapore (adding $5000 to my bank account).

Source: Carousell: a Guerilla UX Case Study

A/B testing is running simultaneous experiments on two or more versions of a page to see which one converts best. Although it’s called A/B, it can be performed with more than two variations.

All you need to do is create different variations of the page you want to experiment on (e.g. reduce the number of form fields, change the layout of elements, change CTAs, etc.) and split incoming traffic evenly between them. Once you have collected sufficient data (i.e. reached statistical significance), you can decide which variant performs best and should be kept.
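One common way to split traffic evenly is deterministic bucketing: hash a stable user ID so each visitor always lands in the same variant across sessions. A minimal sketch (the function name is our own, not from any particular tool):

```python
# Sketch of deterministic traffic splitting for an A/B test: hashing a
# stable user ID ensures a visitor sees the same variant on every visit.
import hashlib

def assign_variant(user_id, variants=("A", "B")):
    """Map a user ID to one of the variants, uniformly and repeatably."""
    digest = hashlib.md5(user_id.encode("utf-8")).hexdigest()
    bucket = int(digest, 16) % len(variants)
    return variants[bucket]

print(assign_variant("user-42"))  # same user always gets the same variant
```

Because the assignment depends only on the ID, there is no per-user state to store, and over many users the MD5 hash spreads traffic roughly evenly across the variants.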

a/b testing

Since growth hacking is all about running experiments to optimise the entire customer journey, A/B tests play a crucial role in identifying what works best for your company and what doesn’t.

In A/B testing, you have two versions of a web page, banner, email subject line, etc. and test them against each other. Typically, the one currently in use is version A (the control) and version B is the new one to be tested. You divide traffic between the two variations and compare their performance based on the success metric you chose for the experiment.

For example:

If you want to test whether the number of form fields on your signup page makes a difference, the metric to look at is your conversion rate:


What elements to experiment with?

What you choose to test depends on what your data tells you (both qualitative and quantitative). E.g. the problem could be that your visitors don’t understand:

  • Your value proposition
  • What you want them to do (unclear CTA)
  • …and much more

The elements that are most commonly tested are:

  • Headlines and/or value propositions
  • Images on pages and landing pages
  • Call to action: wording, size, colors, placement, etc.
  • Layout and style of the webpage
  • Form length
  • Layout of product page elements

A Few Examples of A/B Testing


In the example below, the winning version (second image) changed the two-line headline into a one-line headline plus a sub-head, increasing the sign-up rate by 38%.

Use of images

Using a picture of a person instead of an icon doubled the conversion rate. Research suggests we are subconsciously drawn to images of people, especially babies and smiling women.

Call to action

Small changes in your CTA can make quite a significant difference. In the below example, the CTR increased by 90% just by replacing “your” with “my”.


Online shops often see shoppers abandon checkout because of too many forms and pages. An A/B test can detect this and prove whether a single-page checkout works better than a multi-page checkout process.

Form length

Ask for as little as you can and be direct in your forms. If the form reads like a paragraph with blanks, readers will feel compelled to fill in all the blank spaces, giving you exactly what you need.

Layout of product page elements

Adding trust elements such as a customer review widget can significantly increase your conversion rate, in the below example by 36.73%.

Learn to use your tools

There are many different tools and resources that might help you while you’re A/B Testing. Here are a few of our preferred ones:

  • Google Website Optimizer: It’s free and comes from the best-known search company around, though it’s still missing a few features. Good for getting started!
  • A/Bingo: An A/B testing library for Ruby on Rails; requires programming and integration into your code.
  • Unbounce and Performable: Landing-page builders with integrated A/B testing.
  • Which Test Won?: A game where you guess which variation won an A/B test
  • Tips for A/B Testing: Tips, tricks, and ideas for your next A/B test
  • A/B Ideafox: A good search engine for A/B tests and case studies
  • A place to share and check out A/B test results