The case for building your own AdTech

Apr 23, 2025

Several years ago, I had the pleasure of writing a short piece for this very blog about building vs. buying your own tech stack. Since then, our (small but mighty) agency took the road less traveled: instead of buying and integrating multiple solutions together, we opted to build our stack from the ground up. While we’re not quite at the end of our journey, I thought this would be a good time to take stock, share some of what we’ve learned along the way, and hopefully inspire you to consider going down the same path we did.

Background

One of the biggest hurdles for any media buying agency is the need to consolidate and streamline data for reporting and analysis, and like many agencies throughout the early 2000s, we ended up manually aggregating data into fairly large, cumbersome Excel files. We could usually tell when one of our team members was compiling reports because their laptop fans would be working overtime from the sheer amount of data being processed. Regardless of how adept we became with pivot tables and lookup formulas, getting data out of the platforms was a bottleneck.

While we inherently knew that technology was the solution, we were always hesitant to develop our own. Firstly, unlike our multinational conglomerate cousins, we don’t have the resources for an internal team dedicated to developing our own technology. In fact, for an agency our size, I was the only one with the technical depth and expertise to code. Secondly, we had no intention of becoming an AdTech solutions provider – there were others who had already dedicated themselves to this, and were doing it better.

Why decide to build instead of buy?

There were three reasons behind this approach.

Firstly, just like the conversations we’re having about AI, there is an inherent benefit to reducing the time spent on low-value tasks (i.e. data aggregation) and investing in high-value tasks (i.e. anything else).

Secondly, we have fairly unique teams and processes, which means many of the off-the-shelf solutions would still require a substantial level of customization.

Thirdly, aggregating and storing our performance data in our own database allowed us to tap into Generative AI capabilities more readily and more securely. With mounting concern about proprietary data being used to train third-party AI, having better control of our performance data tipped the scales in favour of building our own data platform.

Vision first, technology second

I recommend having your end vision front and center. It was fairly straightforward to work backwards once the goal was clear. It’s important not just to identify the core components needed to get to the finish line, but also the forks in the road and other intermediate stages where new functionality and features could be introduced to improve day-to-day activities – all without committing any code yet.

We got excited. Maybe too excited, but at least we had a plan.

Although we didn’t realize it at the time, having an overarching vision turned out to be a very good idea. It forced us into specific development decisions (development language and cloud platform selection, among others) that streamlined code development and database architecture, and provided many unforeseen benefits. While we still made plenty of mistakes and questionable decisions along the way, we likely mitigated many of the major ones.

Crawl, walk, jog…and eventually run

Having little to no internal development experience is a mixed blessing. On the one hand, we aren’t weighed down with “too much” experience that would make us avoid pursuing certain goals outright; on the other, we have just enough experience and hubris to think anything is possible. What we did have was the organizational maturity to know that slow, methodical development was beneficial, both for learning about the various platform APIs, authentication schemes and intricacies of the data we can collect, and for slowly managing the change that would come from weaning ourselves off annoying, but well-defined and familiar, processes.

We started off small – focusing on our most-used platforms and learning to manually call their APIs. Instead of storing the data directly in a database, we opted to generate reports in real time and send them straight into Google Sheets or an Excel file. This gave us a better understanding of typical data requirements and needs. It also helped us understand which platforms had better APIs, and which ones would prove problematic in the future. This test-and-learn approach allowed us to manage expectations, scale, and introduce new and better ways of doing things.
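
To make that first step concrete, here is a minimal sketch of the kind of script we started with: pull a date-ranged report from a platform’s reporting API and drop it straight into an Excel file. The endpoint, token and field names are hypothetical placeholders, not any specific platform’s actual API.

```python
# Minimal sketch: pull a report from an ad platform's reporting API and
# write it straight to an Excel file (no database yet).
# The endpoint, token and field names below are hypothetical placeholders.
import requests
import pandas as pd

API_URL = "https://api.example-ads.com/v1/reports"  # hypothetical endpoint
API_TOKEN = "YOUR_API_TOKEN"

def fetch_campaign_report(start_date: str, end_date: str) -> pd.DataFrame:
    """Call the reporting endpoint and return the rows as a DataFrame."""
    response = requests.get(
        API_URL,
        headers={"Authorization": f"Bearer {API_TOKEN}"},
        params={
            "start_date": start_date,
            "end_date": end_date,
            "fields": "campaign,impressions,clicks,spend",
        },
        timeout=30,
    )
    response.raise_for_status()
    return pd.DataFrame(response.json()["rows"])

if __name__ == "__main__":
    report = fetch_campaign_report("2024-01-01", "2024-01-31")
    # Hand the raw rows to the team to pivot and analyse as usual.
    report.to_excel("campaign_report.xlsx", index=False)
```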

Over the first two years of development and use, we started to fall victim to our own success: more users were relying on the tool more frequently, and despite leaning heavily on stable, scalable cloud solutions like Google Sheets, we were running into the scalability issues and limits of free services. It was time to move on and start putting some real investment into our own cloud instance.

With our newfound insights and better understanding of platform APIs, we were able to streamline development, build our own data pipeline into our cloud database, develop a basic web interface, and roll out an alpha version of our internal tool within six months – substantially faster than anticipated. In the months since, we have continued to develop our capabilities, using our campaign data as the backbone and building tangible, useful features into our platform. While we’re not quite ready to tap into Generative AI capabilities, we believe our current setup allows us to run quickly should we want to.
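
As a rough illustration of that pipeline step, the sketch below loads normalised report rows into a database table. SQLite stands in for the cloud database so the example is self-contained, and the table and column names are purely illustrative.

```python
# Simplified sketch of the load step: insert report rows pulled from a
# platform API into a database table instead of a spreadsheet.
# SQLite stands in for our actual cloud database; the schema is illustrative.
import sqlite3

def load_report_rows(rows: list[dict], db_path: str = "adtech.db") -> None:
    """Insert (or update) daily campaign rows into the performance table."""
    conn = sqlite3.connect(db_path)
    try:
        conn.execute(
            """
            CREATE TABLE IF NOT EXISTS campaign_performance (
                report_date TEXT NOT NULL,
                campaign    TEXT NOT NULL,
                impressions INTEGER,
                clicks      INTEGER,
                spend       REAL,
                PRIMARY KEY (report_date, campaign)
            )
            """
        )
        conn.executemany(
            """
            INSERT OR REPLACE INTO campaign_performance
                (report_date, campaign, impressions, clicks, spend)
            VALUES (:report_date, :campaign, :impressions, :clicks, :spend)
            """,
            rows,
        )
        conn.commit()
    finally:
        conn.close()

if __name__ == "__main__":
    load_report_rows([
        {"report_date": "2024-01-01", "campaign": "Brand - Search",
         "impressions": 12000, "clicks": 340, "spend": 512.75},
    ])
```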

Would we have done anything differently?

Looking back, I fundamentally believe we made the right choice in developing our own ad technology in-house. Not only are we re-shaping how our teams approach our campaigns, we are also laying the groundwork to improve our effectiveness and efficiency for the next few years. While we made our fair share of mistakes and had some setbacks in developing our own tool, there are only a handful of things I would have done differently, and my hope is that you can learn from my mistakes.

  • Data structure and database design matter – performance, stability and speed are inherently tied to how your database is structured. A poor database structure can set you back dramatically, both in day-to-day usability and in the time needed to re-engineer it later.
  • Think in platforms, not instances – building specific, ad hoc tools may solve the immediate problem, but without a strategic vision and end goals, you may run into interoperability issues between tools or end up with multiple, competing systems. This not only drives up cloud usage, it also increases the time needed to maintain different sets of code. By developing code as platforms that can be re-used (see the sketch after this list), we shortcut a lot of re-development costs and annoyances.
  • Look into AI development tools – while we use GitHub Copilot internally, there are a myriad of AI-assisted tools available. While AI may not generate better code than the true rockstar developers out there, it has dramatically improved the volume of code that I, as a competent developer, have been able to produce. The AI understands the code patterns I’ve written and replicates them faster. However, for weaker coders or individuals using inefficient code, AI tools will likely underwhelm and reinforce bad coding behaviour.
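
To illustrate the “platforms, not instances” point, here is a minimal sketch of a connector pattern that keeps platform-specific code behind one shared interface so the same pipeline can be reused everywhere. The class and method names are hypothetical, not our actual codebase.

```python
# Minimal sketch of the "platforms, not instances" idea: each ad platform
# gets a small connector implementing one shared interface, so a single
# generic pipeline works for all of them. Names are illustrative only.
from abc import ABC, abstractmethod

class PlatformConnector(ABC):
    """Shared interface that every platform-specific connector implements."""

    @abstractmethod
    def authenticate(self) -> None:
        """Handle this platform's authentication scheme (OAuth, API key, etc.)."""

    @abstractmethod
    def fetch_report(self, start_date: str, end_date: str) -> list[dict]:
        """Return normalised report rows for the given date range."""

class ExampleSearchConnector(PlatformConnector):
    """Hypothetical connector for one platform; others follow the same shape."""

    def authenticate(self) -> None:
        # Platform-specific auth lives here and nowhere else.
        pass

    def fetch_report(self, start_date: str, end_date: str) -> list[dict]:
        # Platform-specific API calls and field mapping live here.
        return []

def run_pipeline(connectors: list[PlatformConnector], start: str, end: str) -> list[dict]:
    """One generic pipeline step that works with every connector."""
    rows: list[dict] = []
    for connector in connectors:
        connector.authenticate()
        rows.extend(connector.fetch_report(start, end))
    return rows
```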

I don’t believe there is a universally right answer to the initial “Build or Buy” question that I set out to answer years ago. That said, a deep understanding of your internal processes, resources and friction points will ultimately guide you to the right decision for you, even if you’re not currently ready for it. In many ways, we need to move beyond simply asking what we can do better now and challenge ourselves with the greater questions of how we can position ourselves for success in the future, and what pieces we need to get there. The hope is that the answer will cut through a lot of the current uncertainties and move us towards better processes, better insights, better decisions, and better outcomes.


AUTHORED BY

Sam Leung

Vice President, Aber Group Inc.



