The problem with manual optimization and what to do instead


This article was contributed by Jeremy Fain, CEO and cofounder of Cognitiv.

Manual optimization, as marketers know all too well, is a hassle. It is difficult, stressful, tedious work that takes forever – and just as you think you have finally cracked the code, something happens and your entire strategy needs to be reconfigured. Yet, despite its inefficiencies, manual optimization is still the process of choice for the majority of advertisers. Why is that, and what can advertisers do to make their optimization process more efficient?

To understand why marketers are choosing to optimize manually, we first have to look at the evolution of programmatic. When it first came onto the scene, programmatic was groundbreaking because it enabled both advertisers and publishers to automate their media buying, and gave advertisers the opportunity to purchase ad inventory quickly and at lower prices. In theory, these lower prices were made possible by the lack of middlemen involved in the buying and selling process. 

Of course, theory and reality are often completely divergent from one another. Although programmatic made it easier for advertisers to purchase ad placements and publishers to sell them, it also created ample opportunities for fraud — not to mention that publishers and marketers were constantly getting hit by hidden tech fees. Additionally, advertisers were struggling to control where they were advertising, leading to concerns about brand safety.  

It was thanks to this state of affairs that the self-service platform was born. Now, instead of handing media buying off to an algorithm that can adapt in real time, advertisers are making those decisions for themselves, using tools like Excel spreadsheets and, if they are advanced, basic decision trees to figure out what to do next. While self-service gives advertisers more transparency and control over ad placement, it comes with its own set of problems. In some ways, it has actually made marketers' jobs more difficult by requiring traders to act as data scientists without any data science experience.

For instance, advertisers have to slog their way through vast quantities of data if they want to find the right patterns to target. This type of data analysis takes a long time if done properly – and traders have neither the time nor the resources to do it right. By the time the analysis is done, the findings are no longer relevant, or the new tactic only works for a short period of time. To make use of these insights effectively, marketers have to implement them in as close to real time as possible – which, when working manually, is virtually impossible.

Another issue with manual optimization is that it is inconsistent. You might stumble across an effective strategy one day, only to try it again the next and discover that your results are completely different. This usually happens because the patterns found reflect proxy data that hide the true reason the advertising is working. Manual optimization is a constant, inefficient game of trial and error, and it prevents advertisers from scaling effectively. To put it another way, manual optimization forces advertisers to work in the short term while leaving very little room to map out a long-term strategy.

What advertisers need is a happy medium between the first iteration of programmatic – programmatic 1.0 – and the mess that self-service has become. In other words, they need to bring back some level of automation to make their lives easier and improve results, but without sacrificing transparency. 

Thus, the rise of a new way of buying: the automated custom algorithm. There are a few companies doing this now, but to truly take advantage of data and find the scale necessary for long-term success, advertisers need the power of deep learning. Deep learning algorithms can make decisions in real time about which placements to buy and which to avoid. This approach has several benefits: deep learning can carry out data analysis and make predictions at a far more sophisticated level than humans can, and, much like humans, it can learn from its mistakes and avoid repeating them.
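At its core, this kind of algorithm scores each incoming bid request and buys only when the predicted value clears a threshold. The sketch below illustrates that decision step with a toy logistic scorer standing in for a real deep network; the feature names, weights, and threshold are all hypothetical, chosen only to make the buy/skip logic visible.

```python
import math

# Hypothetical learned weights for a few bid-request features.
# A production system would use a deep network trained on large
# volumes of impression data; a tiny linear model keeps the
# decision step easy to follow.
WEIGHTS = {"viewability": 2.1, "domain_quality": 1.4, "frequency": -0.8}
BIAS = -1.5

def predict_conversion(features):
    """Return a predicted conversion probability for one bid request."""
    z = BIAS + sum(WEIGHTS[k] * features.get(k, 0.0) for k in WEIGHTS)
    return 1.0 / (1.0 + math.exp(-z))  # logistic squash to (0, 1)

def should_bid(features, threshold=0.5):
    """Buy the placement only if the predicted value clears the threshold."""
    return predict_conversion(features) >= threshold

# A well-viewed placement on a quality domain, shown once so far:
good_request = {"viewability": 0.9, "domain_quality": 0.8, "frequency": 1.0}
# The same placement after the user has already seen the ad many times:
tired_request = {"viewability": 0.9, "domain_quality": 0.8, "frequency": 6.0}

print(should_bid(good_request))   # worth buying
print(should_bid(tired_request))  # skip – predicted value too low
```

The point of the sketch is the shape of the loop, not the model: every request is scored in milliseconds and the buy/skip decision follows directly from the prediction, which is exactly the step a human trader working from a spreadsheet cannot perform in real time.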

Deep learning has changed the way we interact with technology over the past few years — self-driving cars, Instagram effects, augmented reality, and speaking to your devices are all examples of deep learning-enabled tech.  For advertisers, deep learning means they no longer have to worry about delivering performance at scale; as the algorithm learns more about how audience members behave, it can improve its predictions, thus leading to a reduction in costs and increased ROI over time. Finally, deep learning-enabled algorithms can optimize buying decisions in real time, giving traders the time and freedom to oversee all of their campaigns effectively and concentrate on big-picture strategies instead of daily tactics.

To put it plainly, there is really no longer a need for marketers to perform manual optimization. It is inefficient, expensive, and ineffective, and everyone’s time would be much better spent elsewhere. It is time for us as an industry to embrace the next stage in the evolution of programmatic — one that makes marketers’ lives easier while still delivering results.

Jeremy Fain is the CEO and cofounder of Cognitiv.
