Scaling Managed Advertising

Designing for manual involvement in the social advertising space

Project Overview
If you look at the numbers alone, expenditure on online advertising has caught up with expenditure on TV.

With all the advanced formats and tracking available to marketers these days, how could it not? But with more knobs to turn and new variables to track launching every few months, many companies don't have the in-house resources or expertise to keep up.

Ampush answered this by providing analysts who partner with client marketing teams, run digital campaigns (using our internal software, AMP), and advise on online strategy.

Unfortunately, this solution was expensive to scale. Competitor services relied on algorithms instead of human monitoring, and many at Ampush viewed automation as an inevitability.

This project focused on goal setting and design for the next generation of the in-house AMP software.

My Contributions
When I started, I made the classic mistake of treating this as a case of an internal product (AMP) built for expert users (analysts). I interviewed our analysts and quickly found that everyone was doing very different things for different clients.

In order to understand the role that AMP should fill moving forward, I needed to take a more holistic view. The real "product" was a combination of our analysts and AMP. The real "users" we needed to understand were client marketing teams.

I enlisted the help of an analyst with research experience to interview some of our top clients.

Our findings allowed me to steer the AMP team away from the dream of pure automation, and ultimately to focus the product roadmap (and of course, the design) on helping our analysts become the trusted experts that advertisers want, rather than a mere substitute for an in-house team.

Business Problem

As a managed service, Ampush must eventually face the question of how to scale its business. The status quo relied heavily on manual tracking and optimization through AMP, whereas competitor services such as Nanigans went the algorithmic route. When I joined the AMP product team, I saw how this undercurrent colored every product discussion: a pull toward the seemingly inevitable automation and abstraction of analyst services.

However, I wondered—if it’s so inevitable, how are we still attracting these great clients? Is Ampush simply a stand-in until a self-serve solution becomes viable? What does this mean for the next generation of AMP?

My Approach

The product team had always worked closely with our analysts (the direct users of AMP), but had never spoken with clients at length. This was fine for keeping the lights on in Ampush’s early days, but without a basic understanding of how client teams conducted business, we could not define a long-term product direction. With my guidance, our team drafted, recruited for, and conducted our first set of client contextual interviews (most of them on-site with the client).

One set of findings in particular transformed the core conversation on the future of AMP: many client teams want to spend more advertising dollars online, but they are limited by their budgets.

The key to unlocking a larger budget is not purely optimizing for ROI (though that is certainly important), but the client team’s ability to make the case to their department heads. Larger clients have many years of experience advertising in traditional media and understanding its nuances, so a few data reports from limited, unscientific tests, run by a handful of analysts on platforms riddled with reporting and attribution problems, are not exactly what the higher-ups are looking for. In short, they understand TV: where the money goes and how it will impact their bottom line. They haven’t achieved that understanding with online advertising.

Results

This changed our entire team’s understanding of the working relationship between Ampush and our clients. Before, many believed our analysts to be necessary only in the absence of a sufficiently sophisticated algorithm. Now, they understood that an analyst’s true potential is to be the conduit of expertise that clients are looking for.

We could now evaluate product features by whether they helped or hindered analysts in being perceived as expert and reliable.

Every section of our Director’s proposal for the next generation of AMP was rooted in the results of our client research; he saw how essential this perspective was, and that we could not build an experience from analyst reports alone.

The conversation around automation changed. We started talking in terms of how our product should support analysts, instead of what the product could do for them. This was especially important to me because it moved us solidly into the realm of designing for building and retaining expertise.