26 members, 12 questions, 1 buyer’s guide to mobile location data

Posted in Audience Series

By Caity Noonan

It’s not every day that you join your competitors to advise media buyers whose business you’ll likely compete for down the road. But the Interactive Advertising Bureau’s (IAB) commitment to sharing best practices and thought leadership led to the creation of the IAB Location Data Working Group, 26 expert companies united in furthering the mobile advertising ecosystem. PlaceIQ welcomed the opportunity to participate and is excited to empower media buyers to better recognize high-quality, accurate location data.

The result of this industry-wide collaboration was a 12-question guide that advertisers, agencies and marketers can reference when exploring location-based digital marketing and selecting a mobile ad provider. Each question digs deep into the components that dictate the speed, accuracy and ROI of mobile ad delivery within a campaign. The 12 questions are split evenly between two categories: “place data” and “device data.”

The place data questions in the Buyer’s Guide revolve around the source, precision, and verification process of fixed locations in the physical world – such as a baseball field. While the status quo across the ad tech space is to rely on basic geo-fencing and licensed map data to locate these places, we want to pinpoint locations, not just get in the vicinity. So our internal cartography team draws polygons by hand around real-world locations – such as your neighborhood grocery store. That way, every data set mapped onto our 100-meter by 100-meter grid, which spans the United States, is anchored to a precise real-world place.
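
To make the grid idea concrete, here is a minimal sketch of how a latitude/longitude point might be snapped to a cell of a 100-meter grid. The constants and function here are illustrative assumptions, not PlaceIQ’s actual implementation:

```python
import math

GRID_METERS = 100  # assumed cell size: 100m x 100m

def grid_cell(lat: float, lon: float) -> tuple[int, int]:
    """Snap a lat/lon point to an approximate 100m x 100m grid cell index.

    Uses the rough approximation that one degree of latitude spans
    ~111,320 m and one degree of longitude shrinks by cos(latitude).
    """
    meters_per_deg_lat = 111_320.0
    meters_per_deg_lon = 111_320.0 * math.cos(math.radians(lat))
    row = int(lat * meters_per_deg_lat // GRID_METERS)
    col = int(lon * meters_per_deg_lon // GRID_METERS)
    return row, col

# Two pings a few meters apart usually land in the same cell.
print(grid_cell(40.7128, -74.0060))
print(grid_cell(40.71285, -74.00605))
```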

The device data questions probe similar territory – the type of location data, filtration methods, and data accuracy – but differ in that they refer to the location of a user’s devices, which may be at a baseball diamond now and a grocery store two hours later. For this half of the questions, we rely on the work of our data science team and an analytics pipeline that rigorously evaluates the quality of location data within ad requests.

Because we know that opt-in data is the highest-quality location information available, it is the only type of location data that PlaceIQ utilizes. On top of that, we take data filtration further and selectively remove bad data points that don’t reflect normal human behavior, are clearly computer-generated, or are artifacts of location data infrastructure. For you visual learners, here’s what we’re talking about:

In this example, it doesn’t take a data scientist to recognize that these data points are unnaturally dispersed (left) and hyper-clustered (right). Both patterns are far too organized to reflect the randomness of human movement, and therefore must be removed.
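
To make the filtration idea concrete, here is a minimal sketch of one such heuristic: a speed check that drops points implying impossible travel between consecutive pings. The threshold and structure are illustrative assumptions, not PlaceIQ’s production pipeline:

```python
import math

MAX_SPEED_MPS = 70.0  # assumed cutoff (~250 km/h); faster implies a bad point

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance between two lat/lon points, in meters."""
    r = 6_371_000.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp, dl = math.radians(lat2 - lat1), math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def filter_impossible_travel(pings):
    """Keep pings whose implied speed from the previous kept ping is plausible.

    Each ping is (timestamp_seconds, lat, lon), sorted by timestamp.
    """
    kept = []
    for t, lat, lon in pings:
        if kept:
            t0, lat0, lon0 = kept[-1]
            dt = max(t - t0, 1e-9)
            if haversine_m(lat0, lon0, lat, lon) / dt > MAX_SPEED_MPS:
                continue  # implies impossible travel; drop the point
        kept.append((t, lat, lon))
    return kept
```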

We’re proud to have helped put together the IAB’s Mobile Location Data Buyer’s Guide, and to further the mobile ad tech ecosystem by educating current and future media buyers. While filtration and verification aren’t the sexiest parts of what we do, not asking these questions could be detrimental to your mobile ad campaign.

To download the guide, please click here. And if you missed our panel at today’s IAB Mobile Road Show in Chicago, we’ll also be presenting at the NYC stop on the Road Show, on August 21.

iOS 8 Update: A Privacy Win for the Location Data Ecosystem

Posted in Audience Series, Blog, News

PlaceIQ’s Audience Series sets out to highlight the importance of segments in the advertising world. As the pioneer of mobile’s application to location intelligence, and leaders in the mobile audience field, PlaceIQ has the knowledge you need. Key audience experts from each PIQ department — from engineering, to data science, to sales — will tackle topics to give a 360-degree view on this vast, ever-changing industry.

By Drew Breunig

At its developer conference during the first week of June, Apple announced many changes coming in iOS 8. One of them concerns how a device’s MAC address is communicated to WiFi access points.

PlaceIQ doesn’t use MAC addresses to identify devices, and we welcome this change because it protects consumers and reduces potential privacy risks. As it currently stands, MAC address tracking via WiFi is not an opt-in experience, and there is no way to opt out.

If you’d like to understand more, including how this works, whom it affects, and why PlaceIQ welcomes the change, read on. Fair warning: it might get a bit nerdy.

So What’s a MAC Address?

A MAC address is a hardware-based identification number, provided by any device that connects to a network. Hardware-based identifiers are read-only, meaning they can never be changed. They are written to the physical network chip in each device. When a device connects to a router or WiFi network, the device is identified by its MAC address for the duration of its connection. This allows the right traffic to be sent to and from your phone, PC, or TV regardless of how many devices are connected.

When your device is in your pocket or purse, it is regularly looking for a known WiFi network to join. This way, when you pull out your phone at home or at work it’s already connected and ready to go. But as your phone searches for WiFi, it is broadcasting its MAC address to WiFi access points within range. It’s part of the handshake devices engage in to recognize each other.

Recently, a few companies have developed WiFi hubs that remember the MAC addresses they see. They log your device as it scans for a hub, whether or not you join the WiFi access point. These companies have installed these logging WiFi hubs in many places, allowing them to compare visitors as they move from place to place, without their knowledge.

Even if people were informed that their devices were being monitored, the only way to prevent this type of tracking is to turn off WiFi completely. That’s a rather extreme step.

Finally, there’s a difficulty with hardware-based identifiers. The mobile advertising industry, including big players like Google and Apple, has worked hard to move away from hardware-based identifiers as much as possible. Software-based identifiers, like Apple’s IDFA, can be reset by users or blocked entirely. Hardware identifiers will not change for the life of the device. If there is a data leak and a malicious source obtains a hardware-based device identifier, the only way to ensure you will not be affected is to buy a new device.

A Privacy Challenge

So Apple was faced with a challenge: their users’ devices were being logged without their knowledge, without their consent, all while using a hardware-based identifier. Apple’s adherence to standard network practices – broadcasting MAC addresses to WiFi hubs – created an environment where this situation could occur. So Apple made moves to change that standard practice.

Starting in iOS 8, iPhones, iPads, and iPod Touches will broadcast random MAC addresses. In Apple’s words, “The MAC address for WiFi scans may not always be the device’s (universal) address.” Companies that log MAC addresses won’t be able to connect individual visits to a single device. They’ll know someone is there, but not where else they’ve gone.
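
For the technically curious, the general mechanism is easy to sketch. The snippet below generates a random, locally administered MAC address of the kind such randomization schemes broadcast; it illustrates the general technique, not Apple’s actual implementation:

```python
import random

def random_mac() -> str:
    """Generate a random, locally administered, unicast MAC address.

    Setting bit 1 of the first octet marks the address as locally
    administered (i.e., not the one burned into the hardware), and
    clearing bit 0 keeps it unicast.
    """
    octets = [random.randint(0, 255) for _ in range(6)]
    octets[0] = (octets[0] | 0b10) & 0b11111110  # locally administered, unicast
    return ":".join(f"{o:02x}" for o in octets)

print(random_mac())  # e.g. "a6:3f:0e:91:7c:d2"
```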

Some have suggested that this move is a play to get more people using Apple’s own iBeacon API. That may be true. But iBeacons are much more user-friendly: to see a company’s iBeacons, users must install an associated application and grant it the appropriate location permissions. Applications that use iBeacons are opt-in, and users can always opt out by managing location permissions in their device settings.

The Right Move

iOS has a history of protecting user privacy and providing access controls. In fact, this isn’t Apple’s first big MAC address change: last year, it blocked applications from accessing the MAC address. Nor is it the only location-privacy update this year: with iOS 8, Apple is introducing much more explicit controls over background location access.

Overall, I believe Apple’s decision to randomize MAC addresses is a win both for users and for the location data ecosystem: it preserves a managed space where developers can innovate without overstepping user expectations.

As a growing number of applications use location in more diverse ways than ever before, they can now do so in an environment where users still retain control.

Machines Won’t Take Your Ad Tech Job

Posted in Audience Series, Blog, PlaceIQ

PlaceIQ’s Audience Series sets out to highlight the importance of segments in the advertising world. As the pioneer of mobile’s application to location intelligence, and leaders in the mobile audience field, PlaceIQ has the knowledge you need. Key audience experts from each PIQ department — from engineering, to data science, to sales — will tackle topics to give a 360-degree view on this vast, ever-changing industry. The following article ran in AdExchanger’s Data-Driven Thinking series.

By Jonathan Lenaghan

Warnings of the coming Skynet-ization of digital advertising are becoming increasingly common. But rest assured, the near-term future of our industry is not going to be filled with self-aware, artificially intelligent machines that will replace all humans currently employed at ad tech companies.

However, digital advertising does seem to be on the cusp of a significant transformation, in the form of a rapid emergence of platform-centric ecosystems. The number of ad tech companies announcing the launch of a new platform seems to grow daily. These systems will be significantly more feature-rich than the real-time bidding or large Hadoop-based back-end platforms that have defined the industry for the past several years.

To unlock real value, though, these platforms need to enable business analysts, data scientists, campaign managers and an entire host of operations personnel. Many ad tech businesses rely on ingesting, processing and analyzing hundreds of terabytes of data from varied and disparate sources. Traditionally, large teams of engineers with extensive experience in the Hadoop ecosystem were necessary to extract actionable insights.

The next generation of platforms will still perform these functions, but aggregations, algorithms, internal languages and interactive visualization layers will empower this larger family of end users to better define and segment audiences, optimize campaigns based upon industry-specific KPIs or slice and pivot campaign data along many new dimensions. These platforms will no longer be under the exclusive purview of data teams but will be pushed deeper into organizations to those with perhaps less technical experience in big data but with deeper domain experience.

Checkmate

If the focus in the past decade has been to capture and process enormous amounts of data, the next step is to design platforms that strip away this complexity and scale and seamlessly incorporate the expertise of analysts and operations personnel. The emerging platform ecosystem will augment the intelligence of analysts and enable them to effortlessly make business decisions.

Chess is an excellent example of algorithms augmenting the intelligence of a human being. Ever since Deep Blue beat Garry Kasparov in 1997, computers and algorithms have been able to beat the best human grandmasters. The most formidable chess-playing system, however, is a combination of top chess programs and human grandmasters. The sum of algorithms and human beings is greater than either individually.

Algorithmic Black Boxes

The term “platform” has a tendency to evoke images of the purely algorithmic black boxes that dominate the high-frequency equity-trading world. Similarly, bidding on ad inventory will always be algorithmic with little to no direct human interaction. Targeting and serving a digital ad needs to happen in a matter of milliseconds, and so it makes sense that on the surface much of the digital advertising ecosystem is loosely modeled after equity markets.

Algorithmic black boxes, however, are really only successful when they exploit time and capacity scales with a very narrow and specific purpose, such as optimally bidding on ad inventory with sub-millisecond latencies or taking advantage of tiny price discrepancies across multiple stock exchanges. An era in which an analyst or data scientist is replaced by a black box is a long way off.

The R Project for Statistical Computing and other statistical packages, for example, have been around for many years, yet the general population does not seem any better at understanding statistical concepts. Even companies that specialize in black-box optimization employ teams of analysts and data scientists to identify and implement optimization strategies. Those that do not rely on human intuition and experience, I conjecture, are doing a lot of optimizing toward click and impression fraud.

The Power Of Deep Domain Expertise

To be sure, I am no Luddite. I have spent my career with machines, models and algorithms, and the greatest business leverage is found by combining the analyst with the algorithm. There is a common saying among data scientists that bigger data beats better algorithms. I posit that deep domain understanding beats both.

Given the choice between doubling my data size, spending a few months investigating more sophisticated algorithms, or incorporating the work and expertise of a knowledgeable analyst into my platform, I’ll take the human being. Rules and heuristics defined by experts have more utility and can be implemented more quickly and efficiently than fully automated systems that must learn a domain from scratch.

This model has been very successful for companies like Palantir or Quid and is the core strength of the Consumer Insights Platform that PlaceIQ is developing. The Palantir platform works “at the intersection of data, technology and human expertise” to yield actionable results for governments, as well as businesses. Quid, likewise, has built a platform to ingest large amounts of unstructured data to provide analysts with a means of interrogating complex relationships. In these platforms, data and algorithms are used to leverage human experience and intuition.

At the end of the day, the role of domain expertise will tend to outweigh both the sophistication of methodologies and access to more data. The next “Rise of the Machines” will aid analysts and managers, rather than replace them.

Relative Consistency, or What Gödel Might Have to Say About ‘Big Data’

Posted in Audience Series, Blog

PlaceIQ’s Audience Series sets out to highlight the importance of segments in the advertising world. As the pioneer of mobile’s application to location intelligence, and leaders in the mobile audience field, PlaceIQ has the knowledge you need. Key audience experts from each PIQ department — from engineering, to data science, to sales — will tackle topics to give a 360-degree view on this vast, ever-changing industry.

By Susan Zhang

In the early 1900s, David Hilbert set out to prove the consistency of mathematics by reducing it to a formal language of axioms from which all mathematical statements could be derived.

Hilbert believed the derived statements would be consistent with one another: there would be no method of derivation by which we could obtain, from the same set of axioms, “1 + 1 = 2” in one case and “1 + 1 ≠ 2” in another.

One hundred years later, we sit on more data points about human behavior than ever. “Data-driven” is the go-to phrase for making decisions using statistical inference and complex computations. In digital marketing, utilizing these data points can help drive consumer outreach, illustrate trends in consumer behavior, and shed light on patterns that would have otherwise gone unnoticed.

The ways in which we choose to utilize this data can vary tremendously. How, then, can we choose the “best” model?

To determine which method yields better results, we need some standard of measurement against which error can be minimized. Unfortunately, these “true sets” or “true values” are not necessarily available or obvious.

Take, for example, the task of describing everyday human behaviors.

Do people who shop at one grocery store also frequent the nearby fast food chain? Do people with a higher income behave differently than the unemployed? In each case, the point of the investigation is to determine the “truth set” – what people are actually doing, how they should be classified, and what this classification implies about the state of the world. Sure, we can create our own target sets with pre-defined socio-economic biases, but then our algorithms would merely strive to confirm such biases within the entire population, not develop them independently from the raw data itself.

In 1931, Kurt Gödel published two incompleteness theorems establishing the impossibility of Hilbert’s claims. His second incompleteness theorem can be paraphrased:

Given a set of axioms and all statements derived from these axioms, there cannot exist a statement within this set that proves the consistency of the system. If such a statement exists, then the system is inconsistent.
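
Stated a bit more formally (this is the standard textbook formulation, added here for precision):

```latex
% Gödel's second incompleteness theorem, standard formulation:
% for any consistent, recursively axiomatizable theory T that
% interprets enough arithmetic,
T \nvdash \mathrm{Con}(T)
% i.e., T cannot prove the arithmetized statement of its own consistency.
```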

You can almost think of this like defining a word in the dictionary using the word itself: the self-referential nature negates the explanation.

The idea behind Gödel’s second incompleteness theorem closely mirrors the limitations we face in defining human behavior. We need some “truth set” to base an algorithm upon, but any method that both derives an audience’s true behavior and proves its own consistency would run afoul of Gödel’s theorem.

While there may not be a method of deriving the absolute state of the world and knowing its degree of consistency, there is a way we can build ourselves up, layer by layer, using relative consistency.

Let’s start with describing people who own cars. Suppose we have a data set in which 10% of the population consists of 18-to-23-year-olds. Our car ownership algorithm determines that 2% of all car owners are 18 to 23 years old.

This makes sense, since young adults may be less able to afford a car than older adults. The 2% figure, compared against the 10% figure, appears consistent. But if the algorithm determined that 80% of all car owners are 18 to 23 years old, we would have a problem: the 80% figure, compared against the 10% figure, is nowhere near consistent.

In this case, the inconsistency in the results points to a potentially flawed algorithm or a corrupted input data set that is not representative of the true population. A check for the relative consistency of the results would tell us where a problem might exist, and prevent us from further iterations on a flawed algorithm and data set.
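
A minimal sketch of such a relative-consistency check, using the car-ownership example above (the tolerance threshold is an illustrative assumption):

```python
def looks_consistent(pop_share: float, segment_share: float,
                     max_ratio: float = 5.0) -> bool:
    """Flag results whose segment share is wildly out of line with the
    population share. Returns True if the result looks plausible.

    pop_share:     fraction of the population in the group (e.g. 0.10)
    segment_share: fraction of the derived segment in that group (e.g. 0.02)
    max_ratio:     largest over-representation we are willing to tolerate
    """
    if pop_share <= 0:
        raise ValueError("population share must be positive")
    return segment_share / pop_share <= max_ratio

print(looks_consistent(0.10, 0.02))  # True:  2% vs 10% is plausible
print(looks_consistent(0.10, 0.80))  # False: 80% vs 10% flags a problem
```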

Like the processes of quality assurance in a manufacturing plant and ongoing maintenance for the structural base of a skyscraper, these consistency checks are fundamental to the iterative process of extracting meaning from big data. While we rely on complex algorithms to augment human intelligence and intuition, we must also question the integrity of the algorithms themselves to ensure that inconsistencies are rooted out as early as possible.

Gödel’s theorems may only be applicable in a particularly esoteric branch of mathematics, but they still illustrate a lesson that we can all benefit from: it is better to iterate with relative consistency than to settle for inconsistent systems.

What Goodhart’s Law Can Teach You About Performance Data

Posted in Audience Series, Blog

PlaceIQ’s Audience Series sets out to highlight the importance of segments in the advertising world. As the pioneer of mobile’s application to location intelligence, and leaders in the mobile audience field, PlaceIQ has the knowledge you need. Key audience experts from each PIQ department — from engineering, to data science, to sales — will tackle topics to give a 360-degree view on this vast, ever-changing industry. The following article ran in AdExchanger’s Data-Driven Thinking series.

By Roman Shraga

Is there a metric you use to evaluate the effectiveness of something critical to your company’s success? What about a metric used by your company to evaluate you?

If so, it is essential that you understand what could go wrong in the evaluation of performance data. Your job depends on it!

Performance data is the information that is used to assess the success of something. It’s how you evaluate the effectiveness of an ad campaign, the throughput of an engineering organization, or the business attributable to a specific salesperson, for example. Because performance data is directly tied to the key goals of both individuals and organizations, it is a sensitive – and even contentious – topic. It is ripe for obfuscation and abuse.

Goodhart’s Law

A critical insight into how to deal with performance data comes from Goodhart’s Law: “When a measure becomes a target, it ceases to be a good measure.” In other words, when the measure being used by decision-makers to evaluate performance is the same as the target being optimized by those being measured, it is no longer a reliable measure of performance.

The most cited example of this law in effect is the case of nail factories in the Soviet Union. The goal of central planners was to measure performance of the factories, so factory operators were given targets around the number of nails produced. To meet and exceed the targets, factory operators produced millions of tiny, useless nails. When targets were switched to the total weight of nails produced, operators instead produced several enormous, heavy and useless nails.

The above example is absurd, but illustrates the point: When a measure of performance is the same as the target, it can be abused to the point of no longer being useful in measuring the desired outcome.

Advertising Implications

This happens all the time in the modern world. For example, when CTR is both a measure and a target, ad companies have a perverse incentive to optimize for clicks with absolutely no regard for who is doing the clicking. An ad campaign for Ferrari with a CTR of 15% sounds amazing – unless the majority of people who clicked the ads are teenagers looking at pictures of cool cars.

Similarly, when cases closed is both the measure of performance and target of customer service organizations, employees might choose to close cases without fully investigating and resolving them. When page views are both the measure and the target of news sites and blogs, editors have incentives to post shocking and controversial content to optimize for the target. In the long run, of course, this behavior degrades the quality of the site and the page views measure is no longer a useful indicator of the desired outcome of an engaged user base.

Mitigation Techniques

Examples of Goodhart’s Law can be found in every industry and every department of an organization. Fortunately, there are several approaches that can be taken to mitigate its harmful effects.

  1. The first approach is also the most difficult. By thinking deeply about what is being measured and what the constraints are, it is possible to formulate better measurements. A body of knowledge known as the theory of constraints can be used to guide your thought process as you try to come up with a better measure.

    For example, as an alternative to relying on cases closed as a measure of customer service, a company can learn from Zappos and strive to quantify and reward good experiences as reported by customers. Still, it must be said that there is debate about whether it is even possible to find a single measure that is immune to the effects of Goodhart’s Law.

  2. A second approach could be to create a “balanced scorecard” of several different measures instead of relying on one. With this strategy, you reduce the risk of a single measure being gamed by looking at multiple measures that evaluate performance from different angles. For example, CTR can be supplemented with a measure of traffic quality, such as bounce rate or conversion rate.

    When you add multiple measures to your overall performance evaluation, you not only reduce the opportunity for abuse, but you begin to get a more nuanced understanding of the inherent tradeoffs being made. This is similar to the dual metrics of precision and recall used in machine learning classification problems: together they measure how often the machine gets the right answer and what proportion of the total right answers the machine is able to find. (A minimal sketch of this pairing follows the list.)

  3. A third way to mitigate the effects of Goodhart’s Law is to simply use human discretion. This means poking and prodding a reported performance measure until you develop a true understanding of what it is actually indicating. You need to ask questions that ensure the measure relates to the ultimate goal.

    Additionally, think about whether it would be possible to get a perfect score on the measure and, if so, whether it could be done without adding any value. This line of reasoning will allow you to dissect a measure until you understand whether or not it is doing a good job of indicating performance.
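
As referenced in the second approach above, here is a minimal sketch of the precision/recall pairing and of reporting several measures side by side. The metric choices and numbers are illustrative assumptions:

```python
def precision_recall(true_pos: int, false_pos: int, false_neg: int):
    """Precision: of the items we flagged, how many were right.
    Recall: of the items we should have flagged, how many we found."""
    precision = true_pos / (true_pos + false_pos)
    recall = true_pos / (true_pos + false_neg)
    return precision, recall

def scorecard(ctr: float, conversion_rate: float, bounce_rate: float) -> dict:
    """Report several measures side by side rather than one gameable number.

    A 15% CTR with a 90% bounce rate tells a very different story than
    a 1% CTR with a 20% bounce rate.
    """
    return {"ctr": ctr, "conversion_rate": conversion_rate, "bounce_rate": bounce_rate}

print(precision_recall(true_pos=80, false_pos=20, false_neg=40))  # (0.8, 0.667)
print(scorecard(ctr=0.15, conversion_rate=0.001, bounce_rate=0.90))
```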

In the end, a mix of all three approaches to mitigation is the most judicious thing to do. You should strive to create the best possible measures that look at performance from multiple angles while always maintaining skepticism and inquiry.

Dynamic Creative: The Perfect Gift for Every Campaign on Your List

Posted in Audience Series, Blog, Most Recent

PlaceIQ’s Audience Series sets out to highlight the importance of segments in the advertising world. As the pioneer of mobile’s application to location intelligence, and leaders in the mobile audience field, PlaceIQ has the knowledge you need. Key audience experts from each PIQ department — from engineering, to data science, to sales — will tackle a new topic to give a 360-degree view on this vast, ever-changing industry.

By Young Lee

In the midst of the holidays, consumers are hustling to find the perfect personalized gift. The same is true for marketers and media buyers, who strive to deliver the most relevant and personalized ad to their target audiences.

Using PlaceIQ’s technology, marketers are able to build sophisticated audiences based upon real-world behavior, and then leverage the context of time and location to deliver the most relevant ad. However, this is only half of the puzzle. The other half is the mobile creative. Many marketers execute campaigns with a “one creative fits all” approach across the targeting tactics of their mobile campaigns. But mobile is different from desktop precisely because of the location-awareness of mobile devices, and delivering audience-aware mobile creative can produce markedly better results.

Audience-Aware Creative

When building out the mobile campaign creative, it is important to consider the audience, location, and time of day to give the ad a personalized touch that resonates with the end user. Choosing the right creative is in itself a daunting and complicated task. Below are some suggestions for improving mobile creative, gleaned from my work with marketers (a simple selection sketch follows the list).

  1. Lower- Versus Upper-Funnel Tactics

    Targeting users in the lower funnel, like in-store or at the point of purchase, should have messaging related to features, price-points, and product descriptions. Users in the upper funnel should have more general brand messaging.

  2. Demographics

    Marketers should consider the demographics of their users. For example: Is the targeting tactic reaching price-conscious or affluent users? Affluent users are likely price-insensitive, so messaging should include features and benefits.

  3. Location

    Because the duo of location and time is a large indicator of intent, taking the combination into account for creative messaging is also important. Is your targeted audience interested in music? Have they been to a music venue recently? Were they there at night, perhaps when a concert could have taken place? This user will likely respond to messaging that includes a music theme.
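
As a simple illustration of the three suggestions above, here is a sketch of how a creative variant might be selected from audience and location/time signals. All signal names and variant names are illustrative assumptions, not PlaceIQ’s actual system:

```python
def choose_creative(funnel: str, affluent: bool, music_fan_at_venue: bool) -> str:
    """Pick a creative variant from audience, location, and time signals."""
    if music_fan_at_venue:
        return "music_theme_creative"      # location + time suggest music intent
    if funnel == "lower":                  # in-store or near the point of purchase
        if affluent:
            return "features_and_benefits_creative"
        return "price_point_creative"
    return "general_brand_creative"        # upper funnel: broad brand messaging

# Example: a price-conscious shopper currently in-store.
print(choose_creative(funnel="lower", affluent=False, music_fan_at_venue=False))
```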

Measurement

For any creative execution, it’s always important to have some sort of measurement to track results. While click-through and interaction rates are the standard performance metrics for display and mobile campaigns, PlaceIQ’s Place Visit Rate™ (PVR™) is a strong indicator of performance for branding campaigns because it helps marketers determine the true impact of their ad on in-store visits.
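
As a simplified illustration of the intuition behind a visit-rate metric (this is not PlaceIQ’s actual PVR™ methodology, and the numbers are invented):

```python
def visit_rate(visitors: int, audience: int) -> float:
    """Fraction of an audience later observed at the target location."""
    return visitors / audience

# Invented numbers for illustration only.
exposed = visit_rate(visitors=450, audience=10_000)   # devices that saw the ad
control = visit_rate(visitors=225, audience=10_000)   # comparable unexposed devices
print(f"exposed PVR {exposed:.2%}, control {control:.2%}, lift {exposed / control:.1f}x")
```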

An early success with audience-aware creative came in a recent mobile campaign for an auto advertiser driving consumers to its dealerships. The campaign tested placements using generic creative against dynamic creative, in which the mobile ad’s messaging and background changed depending on a user’s location. Initial results are showing positive signs: users exposed to dynamic ads are about two times more likely to visit target destinations than those reached through the other targeting tactics.

Choosing the right creative (or the right gift) is always a difficult task. There is never a silver bullet, and it will require multiple rounds of trial and error. However, learning about your target audiences and tailoring the creative based on this data can go a long way to connecting your brand to your audience.

What We’ve Learned: Navigating an Event-Driven Mobile Campaign

Posted in Audience Series, Blog

PlaceIQ’s Audience Series sets out to highlight the importance of segments in the advertising world. As the pioneer of mobile’s application to location intelligence, and leaders in the mobile audience field, PlaceIQ has the knowledge you need. Key audience experts from each PIQ department — from engineering, to data science, to sales — will tackle a new topic to give a 360-degree view on this vast, ever-changing industry.

By Joseph Ranzenbach

PlaceIQ CEO Duncan McCall has long said that “location is the biggest indicator of intent since search,” and brands and advertisers alike seem to agree. Mobile, the media format that enables the greatest accuracy in identifying a user’s location, has seen advertising dollars grow faster than in any other media format, and much of this spend is going to location-based targeting.

And why not? Not only are there more Android devices activated each day than there are babies born, but users are spending increasing amounts of time on their newfound electronic lifeblood, which is rarely more than three feet away from them. Add in granular location data on each impression, combine that with a sophisticated understanding of the space and time in which consumers travel, and you have an incredible platform for audience development, targeting, insight and attribution.

But in order to build and execute a successful campaign, particularly for the holidays over the coming weeks, it is imperative to develop a well-planned, event-based advertising strategy.

The Challenges of Event-Driven Targeting in Mobile

Targeting a location-based advertising campaign on a day like Black Friday, Cyber Monday, or any of the hectic shopping days before Christmas presents an enthralling set of problems to solve. In our experience running event-driven mobile campaigns, including a particularly relevant, large-scale campaign that targeted in-store customers on Black Friday 2012, we’ve learned to fine-tune our execution strategies to account for a few potential challenges:

  1. Supply Scarcity

    Whether you’re a big box retailer with 1,000 brick-and-mortar locations across the country or a regional consortium of auto dealers, it’s important to note that there is a limit on the market of available impressions for your in-store customers. (Khoa Pham wrote an excellent post on Fermi problems and ad targeting; a rough back-of-envelope sketch follows this list.)

  2. Increased Competition

    Increases in consumer purchase intent on days like Black Friday are met with increases in advertising demand, meaning competition for consumer attention on the limited available impressions is much higher than usual. Despite increased shopping activity, supply will not rise to meet demand in many cases.

  3. Scalable Infrastructure

    The rush of ad demand to RTB, which has consistently outpaced research firms’ projections in recent years, has led to an increased supply of ad impressions and, consequently, an increased requirement for ad buyers to build scalable infrastructure. The spike in consumer activity and competition for impressions brought on by holidays and large-scale events makes scalable infrastructure even more important: to capture as many of those limited, relevant impressions as possible and beat out the competition for them, you need to be able to see as much of the pool as possible. Unfortunately, because in-store audiences are innately scarce, the law of diminishing returns applies to supporting ever-larger pools of inventory and infrastructure.

  4. Atypical Behavioral Patterns

    Large-scale events affect consumer behavior and require algorithmic adaptation and iteration. What works on a typical Friday will not likely produce the same levels of accuracy or success on Black Friday or the Super Bowl when it comes to targeting, analytics, and performance (Rachit Srivastava wrote a great post on this topic).
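
As referenced in the first challenge above, here is a rough Fermi-style sketch of in-store impression supply. Every number is an invented assumption meant only to show the shape of the estimate, not a real market figure:

```python
# Back-of-envelope (Fermi) estimate of daily in-store impression supply.
stores = 1_000                        # brick-and-mortar locations
shoppers_per_store_per_day = 2_000    # holiday foot traffic per store
smartphone_share = 0.7                # shoppers carrying a reachable device
in_store_app_minutes = 5              # minutes of in-store app usage per shopper
impressions_per_minute = 0.5          # biddable ad requests per app-minute

supply = (stores * shoppers_per_store_per_day * smartphone_share
          * in_store_app_minutes * impressions_per_minute)
print(f"~{supply:,.0f} in-store impressions per day")  # ~3,500,000
```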

Making Location Scale & Constructing Successful Campaigns

So given the known limitations, how do you make location scale to most effectively reach your audience and meet your campaign goals?

  1. Build a Strategic Audience Portfolio

    In building your campaign, it’s important to develop a targeting portfolio that not only helps you to achieve your goals, but also hedges your risk and offers opportunities for some home runs. For instance, while targeting consumers that are in your brick-and-mortar locations is an exceptional component to any campaign, it shouldn’t be your entire campaign, as it may limit your scale and reach. As supplemental audiences in your portfolio, why not try conquesting those who have visited a competitor’s location in the past and also targeting locations that consumers are likely to visit before reaching your stores?

    Additionally, it’s worth noting that growth in mobile commerce is outpacing both e-commerce and in-store sales. Why not build that into your campaign strategy as well? Given that not all consumers will be doing their shopping in-store this year, adding line items for in and out-of-home audiences with affinities for your products can help capture mobile or online commerce in addition to helping drive brick-and-mortar traffic.

  2. Utilize Data Intelligence, Not Data Sets

    There are a lot of flawed and incorrectly attributed location data sets out there, and unfortunately a lot of folks are using the same ones. When dealing with large amounts of spend, especially over short periods of time, invest in a partner who invests their own time and resources in data aggregation, intelligence, quality, and analytics. There’s a reason Apple’s initial mapping foray met so much criticism – location is a lot harder than it seems.

  3. Plan Ahead & Close the Feedback Loop

    At PlaceIQ, we’ve plotted billions of points of information against our patented location analysis platform to derive an intuitive, audience-based understanding of the world around us. In preparation for large-scale, event-based campaigns, we reference previous campaigns and data sets with billions of data points and comparable conditions to adapt our expectations, rather than, say, a panel of a few thousand users (Extra Credit Reading: Why We Need to Do Better than Panels, Focus Groups and Surveys).

The holidays can be a very stressful time for many brands and advertisers, but they really don’t have to be. Significant scale and profitable results can be achieved by intelligently targeting your ad spend through strategically built audience compilations and being flexible with iteration. When it comes to event-driven targeting in mobile, thoughtful planning and leaning on data-driven insights can make the difference between celebrating an innovative and successful campaign and standing still while your competitor does.

Black Friday: The Ultimate Test

Posted in Audience Series, Blog

PlaceIQ’s Audience Series sets out to highlight the importance of segments in the advertising world. As the pioneer of mobile’s application to location intelligence, and leaders in the mobile audience field, PlaceIQ has the knowledge you need. Key audience experts from each PIQ department — from engineering, to data science, to sales — will tackle a new topic to give a 360-degree view on this vast, ever-changing industry. The following article was featured as an AdExchanger “Data-Driven Thinking” column.

By Rachit Srivastava

Thanksgiving is around the corner, meaning great food, family and, of course, Black Friday.

Mobile ad targeting and strategy have advanced significantly this year, and mobile marketers are reaping the rewards of proven, solid algorithms and enjoying consistent uplift in success metrics. They should be feeling pretty comfortable with being able to target audiences efficiently on this crazy shopping weekend, right? Why would their algorithms fail them now?

Black Friday is like no other time of the year, and it may require marketers to step out of their comfort zones. This is no time for autopilot. If you want to pass the ultimate test, preparation is key.

Major Challenge 1: Infrastructure

On Black Friday, consumers are more inclined than ever to shop, and advertisers want to take extra advantage of this. This means that overall there will be more competition for targeted ad impressions served that weekend than on other days of the year. Advertisers will funnel an increasing number of ads towards these limited impressions in order to cash in on the intent of consumers and channel them toward stores.

To read the rest of this article, please visit Adexchanger.com.

6 Reasons Why Data Science is an Art

Posted in Audience Series, Blog

PlaceIQ’s Audience Series sets out to highlight the importance of segments in the advertising world. As the pioneer of mobile’s application to location intelligence, and leaders in the mobile audience field, PlaceIQ has the knowledge you need. Key audience experts from each PIQ department — from engineering, to data science, to sales — will tackle a new topic each week to give a 360-degree view on this vast, ever-changing industry.

By Juan Huerta

If Michelangelo were alive today, what would he do for a living?

He would be a data scientist, of course.

Sure, crunching massive data sets on multi-thousand-core clusters using algorithms that were once the exclusive domain of the scientific elite might not seem like an obvious career choice for the famous Caprese maestro, but I firmly believe he would make quite the data scientist.

Here’s why: behind the name, data science is a transformative craft that shares a number of similarities with art.

  1. Transformative Synergism: In both disciplines, the final result is much greater than the sum of the inputs. In data science, our prime raw material (data) goes through a transformation of cleansing, parsing, and normalization, followed by iterative optimizations in which incremental value is distilled out of the raw material and a final result emerges in the form of new information. As a case in point, imagine the workshop of Antonio Stradivari systematically transforming a generic block of wood into a unique piece of acoustic excellence.
  2. Technique and Apprenticeship: Art, like data science, is based on mastery of technique. A painter takes pride in technical skill the same way a data scientist takes pride in the techniques he or she can bring to the table. And, in both cases, while fundamentals are learned through formal education, true technical mastery is developed only over time and through hands-on experience. Data organizations are essentially workshops where technique is constantly emphasized, transferred, and codified in the form of best practices, proprietary information, and techniques.
  3. Experimentalism and Innovation: As piano technology evolved during the 18th and 19th centuries, composers of the era quickly assimilated innovations to the instrument and dutifully reflected them in their music. One can imagine the excitement of a young Beethoven on noticing the expressive range of the new pianos of his time. Data scientists have the same curiosity and adventurous spirit, constantly assessing every new technology that promises improved outcomes and workflows.
  4. Creativity, Imagination and Hacker-spirit: Anybody who has worked on code, models, algorithms, or data pipelines knows that in our line of work, inspiration and creativity are crucial. Sure, we still need the proverbial 99% perspiration, but if you haven’t got the inspiration, no amount of perspiration is going to get you out of your predicament. An online definition of “hacker” refers to a person “who enjoys the intellectual challenge of creatively overcoming or circumventing limitations.” Creativity is a strategic weapon in the hacker’s arsenal. One story tells of Paganini occasionally breaking his violin strings during performances to demonstrate his virtuosity; creatively overcoming these self-imposed limitations earned Paganini the honorary title of “hacker.”
  5. Specialization and Differentiation: Nobody disputes that Leonardo’s art is distinctive. Mozart never said, “My music is awesome because it sounds just like Bach’s.” Nobody mixes up their Dalís and their Picassos. Every data organization strives for differentiation in technique and approach in order to produce characteristically distinctive results. Like artists, top data science organizations typically operate within their own niches of excellence, reflecting the way we think about and approach a problem, our domain of expertise, and the resources and techniques we bring to the table.
  6. Persistence and Detail-orientation: Those of us who have seen architectural marvels like La Alhambra in Spain have inevitably wondered how many thousands, if not millions, of man-hours were spent creating such extensive and magnificent works. Likewise, data science is done through careful persistence and fastidious attention to detail. The true data scientist will understand that the difference between a tight model and a sloppy one is the belief that no detail is too small to matter and that in the end, this obsessive persistence is what makes all the difference.

We could go on listing similarities, but the ones mentioned illustrate the main parallels between data science and the creation of art. I have no doubt a latter-day Michelangelo would relinquish his chisel and mallet for some Hadoop and NoSQL.

The Year of Mobile?

Posted in Audience Series, Blog

PlaceIQ’s Audience Series sets out to highlight the importance of segments in the advertising world. As the pioneer of mobile’s application to location intelligence, and leaders in the mobile audience field, PlaceIQ has the knowledge you need. Key audience experts from each PIQ department — from engineering, to data science, to sales — will tackle a new topic each week to give a 360-degree view on this vast, ever-changing industry.

By Duncan McCall

We’re all amusingly familiar with the seemingly annual ritual of analysts proclaiming that the upcoming year will be the “Year of Mobile.”

This has always struck me as analogous to saying it’s the “year of the car” or the “year of the computer.”

It bugged me that no one really took this statement to task, so I gave a presentation recently titled “The Year of Mobile,” and when I started really thinking about this concept, it became quite an interesting exercise.

So what year could actually be considered the mythical “Year of Mobile” if you were to actually try and identify one?

Maybe it was 1983, when Motorola launched the first mobile phone: the DynaTAC (forever immortalized by Gordon Gekko). That had to be the year of mobile… right?

Or perhaps 1989, when the Motorola MicroTAC came out, and every sales rep worth their salt was showing off their slick new flip phone.

No? Well then, what about 1993, when Nokia launched its first real GSM phone, complete with a calendar and the oh-so addictive game of Snake?

If not, then it had to be 1999, when BlackBerry took email on a phone mainstream with huge success – a watershed moment if there ever was one.

What about 2007? This had to be it. The iPhone, with Android close behind, ushered in the smartphone era, with what was to become the fastest consumer adoption of any technology in history.

Then, of course, it dawned on me: there was no “Year of Mobile.” We are simply living in the Age of Mobile.

As we sit now with this so-called smartphone revolution well underway, we have always-on, always-near, location-aware, connected devices that we check 40 times a day.

We’ve all seen first-hand how these devices have changed our daily lives.

They’ve changed the way we consume news and take and share photos. They’ve rendered paper maps almost obsolete. Ultimately, they have changed the way we actually behave, communicate and interact with one another.

They are also starting to change the way brands and advertisers communicate with their customers and learn how their products are used in the real world. They may even provide a platform for realizing the long-unfulfilled dream of one-to-one marketing.

Even more, we’re still in the very early days of this revolution. We’ve only just passed majority smartphone usage in the US, and wearables, indoor location, health and fitness, and mobile payments are all barely getting started – yet, along with many other mobile-integrated technologies, they are poised to have a tremendously disruptive and transformative effect on our lives.

Analysts search for some magical metric over a short time horizon to declare that mobile has finally arrived. I think we are instead in the middle of a technology phase that will be measured in decades and will continue to morph, change, and defy simple labels or even accurate measurement.

So here’s to embracing the fact that we’re not in the “Year of Mobile.” We’re simply living in the Age of Mobile.