Online marketing for professional services has recently become more of a science than an art. This stems from the abundance of data at our fingertips and our ability to measure, process, and analyze this data instantly.
With this ability to measure metrics like impressions, clicks, and downloads, there is a clear opportunity to implement a time-tested process for experimenting with new marketing tactics.
Following the scientific method involves documenting your experiment’s process. That way, if an experiment produces positive results, you can replicate it in your everyday marketing.
In this blog post we will look at each step of the scientific method and apply it to one example of online marketing for professional services.
Here are the steps in the scientific method:
- State the purpose (the problem you want to solve)
- Do background research
- Form a hypothesis
- Conduct the experiment
- Analyze the data
- Draw a conclusion
The purpose of your experiment is to solve a problem. For example, let’s say you have invested in creating quality downloadable content on your website, but you aren’t seeing as many downloads as you would like. Make sure you document your purpose in a clear and concise manner. For example:
Purpose: to increase the average monthly downloads of your company’s educational PDF guide
The end goal of your experiment is to increase the total downloads of your content, but how?
Once you’ve stated the purpose of your experiment, it’s time to do some research on the topic at hand. Think of this step as absorbing as much information as you can around the topic. For collecting this information, try these tips:
- Ask industry experts for tips
- Read authoritative blogs
- Download how-to guides and other resources on increasing content downloads
From the research you conducted, let’s say that some of the tactics that you found to increase content downloads include:
- Split-testing different calls to action (CTAs), such as “download now” or “learn more”
- Targeting relevant keywords to build a pay-per-click (PPC) campaign
- Creating a content calendar of blog posts, videos, and social media promotion that direct traffic to your downloadable content
- Testing different CTA button colors
At this point you’ve identified the purpose of your experiment and done some background research on the topic. The next step will be to form a hypothesis, or an educated guess as to what the outcome of your experiment will be.
Because your research shows multiple tactics that could increase the monthly downloads of your content, you may need to conduct multiple experiments. It’s also important to note that, for the best accuracy, each tactic should be tested one at a time. Incorporating all of the above at once may still produce results, but you won’t know which tactic drove them.
For simplicity’s sake, let’s use the example of testing different CTA button colors for our experiment. In your research, let’s suppose you found that the color of the CTA button impacts the number of clicks. Depending on a number of factors, such as your CTA copy and your website’s color palette, you may hypothesize that one color will perform better than another. Setting those variables aside, your hypothesis might look something like this:
Hypothesis: If I change the color of my CTA button from orange in month one to purple in month two, orange will have double the click-through rate of purple.
Now that you have stated your hypothesis, it is time to construct your experiment. The goal here is to develop a procedure that will support or refute your hypothesis. Because you want to measure the click-through rate of two different button colors, the experiment will run in two phases, with only the color of the button (the independent variable) changing.
*Note: while conducting your experiment, monitor the rate of your other online activity (blog posts, social media) so you can keep it at the same level during month two of your experiment.
At the end of the first month of testing, record the metrics you will want to compare after you change the color of the button. In this experiment, you will likely want to look at total page views and total button clicks in order to calculate the click-through rate (number of clicks / page views). After you have recorded this initial set of data, change the color of the button and proceed with phase two of the experiment.
Raw numbers can be misleading, so when analyzing the data it is important to focus on the right metrics within the context of the experiment. For example, the results of your experiment show a total of 93 clicks on the orange button in the first month and 95 clicks on the purple button in the second month. One might conclude that color doesn’t make a difference in the number of clicks.
However, when you take into account that there were 2,650 page views in the first month and 5,000 page views in the second month, the picture becomes clear: the orange button saw a click-through rate of 3.5%, while the purple button saw a click-through rate of 1.9%.
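To make the arithmetic concrete, here is a short Python sketch that computes the click-through rates from the example numbers above and checks them against the “double” threshold stated in the hypothesis (these are the illustrative figures from this post, not real campaign data):

```python
# Example figures from the experiment described above.
orange_clicks, orange_views = 93, 2650   # month one (orange button)
purple_clicks, purple_views = 95, 5000   # month two (purple button)

def click_through_rate(clicks, views):
    """Click-through rate = number of clicks / page views."""
    return clicks / views

orange_ctr = click_through_rate(orange_clicks, orange_views)
purple_ctr = click_through_rate(purple_clicks, purple_views)

print(f"Orange CTR: {orange_ctr:.1%}")   # 3.5%
print(f"Purple CTR: {purple_ctr:.1%}")   # 1.9%

# Hypothesis: orange will have double the click-through rate of purple.
print("Hypothesis supported:", orange_ctr >= 2 * purple_ctr)
```

Looking at raw clicks alone (93 vs. 95), the two buttons seem tied; dividing by page views is what reveals the real difference.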
While orange performed better than purple, it did not achieve the “double” threshold as stated in the hypothesis. However, all is not lost.
You’ve concluded that color does have an impact on the click-through rate, and you’ve established benchmark click-through rates for both a warm and a cool color. Additionally, like any good scientist, you’ll find your experiment has opened the door to new possibilities. You might ask yourself, “If color has an impact on click-through rate, which color is best?” The answer lies in your next experiment.
Learn more marketing tips to grow your firm in the free ebook, Online Marketing for Professional Services.