PPC
- GAds account organization (9 Topics | 1 Quiz)
- Search ads (36 Topics | 1 Quiz)
  - Campaign creation
  - Settings (location, language, start/end date, networks, bid strategy (CPA/CPC), budget)
  - Location
  - Language
  - Start / End date
  - Networks
  - Bid strategy
  - Target cost per action (CPA)
  - Target return on ad spend (ROAS)
  - Maximize Conversions
  - Maximize Conversion Value
  - Enhanced cost per click
  - Keyword strategy
  - Keyword research
  - Keyword match types
  - Exact match
  - Phrase match
  - Broad match
  - Negative keywords
  - Search terms
  - Adding keywords
  - Negative keyword (NKW) list
  - Managing search terms
  - Long-tail keywords
  - Create ad groups
  - Keyword structure
  - SKAG
  - Single keyword ad groups
  - SKAG's main benefits
  - Drawbacks to using SKAG keyword groups
  - A/B testing
  - Adding a target URL
  - Write and launch PPC ads
  - Titles
  - Descriptions
  - Headlines
- Display Ads (16 Topics | 1 Quiz)
- Video Ads (17 Topics | 1 Quiz)
  - Video ads
  - Choosing a goal
  - Choosing an ad format
  - Settings (formats, location, budget)
  - Formats
  - Skippable in-stream ads
  - Non-skippable in-stream ads
  - In-feed video ads
  - Bumper ads
  - Outstream ads
  - Masthead ads
  - Location
  - Excluded locations (list)
  - CPV bidding
  - Target Impression Share bidding
  - Bidding / Budget
  - Create relevant ads
- Analytics (19 Topics | 1 Quiz)
  - Google Ads analytics (what it is)
  - Where to find it
  - Link GAds to Analytics
  - Export data from Google Analytics to GAds reports
  - Wasted spend
  - Google Ads metrics
  - Quality Score (Google Ads metrics)
  - Impression Share
  - Click-through rate (CTR)
  - Account activity
  - Impressions
  - CPC
  - Setting goals
  - Maximum bid
  - Quality Score (Setting goals)
  - Google Ads Ad Rank
  - Long-tail keywords
  - Text ad optimization
  - Conversions
- GAds Optimization (8 Topics | 1 Quiz)
- Audience Manager (8 Topics | 1 Quiz)
- GAds tools and settings (26 Topics | 1 Quiz)
  - Google Ads tools and settings
  - Account management tools
  - Google Analytics
  - Ad Preview and Diagnosis
  - Display Planner
  - Keyword tools
  - Keyword Planner
  - SEMrush
  - KWFinder
  - Ahrefs Keywords Explorer
  - GrowthBar
  - Long Tail Pro
  - Majestic
  - Keyword Tool
  - Moz Keyword Explorer
  - SpyFu
  - Bid and budget management tools
  - WordStream PPC Advisor
  - Optmyzr
  - Bing Ads Editor
  - Marin
  - Acquisio
  - Canva
  - Facebook Ad Gallery
  - AdEspresso
  - Google Ads Editor
- Google Ads and Facebook (9 Topics | 1 Quiz)
Ad Testing
01.02.2022
Ad testing is the process of putting different ads in front of a sample of your target audience and asking for feedback on them. You can run ad tests on an entire ad or on specific aspects of it, and collect feedback on anything from how much the ad stands out to how believable your audience finds it.
Throughout this page we’ll talk about how to perform ad testing and the best practices when performing it, but first, let’s review why it’s so important.
Why ad testing matters
The amount companies spend on ads is astronomically high (more than $500 billion is spent on advertising worldwide every year), and it's only growing.
Why are brands spending so aggressively on ads? Because they're effective. For example, consumer packaged goods (CPG) brands see a solid return on their ads across media types. When you incorporate pre-launch testing to home in on particular ad concepts, the chances of a strong return only grow.
Measuring advertising effectiveness through testing offers 4 additional benefits:
- It gives you data to back up your decisions. Nobody can argue against hard numbers. If you can prove which ad concepts are the best, you should have an easy time persuading colleagues to run with the winning ads.
- It provides ideas for further improvement. Your winning ad might not be perfect. For example, say you’re focused on copy testing. Your audience likes the winner, but they don’t find its message completely believable. This input gives you a chance to take already likeable copy and make it even better!
- It allows you to understand and segment different audiences. Once your responses come back, you can filter them to see how different groups (e.g. male vs. female) feel about each of your ads. These insights can help you pick specific ads for individual groups, or make a single ad that incorporates elements each group likes (a segmentation sketch follows this list).
- It helps your organization iterate quickly. The ability to do so is associated with performing agile market research: a method of research that involves frequent rounds of data collection to account for an organization's changing needs over time. Agile research empowers your team to make better decisions more often and to rely less on external insights organizations, agencies, or other research providers.
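As a concrete illustration of the segmentation point above, here is a minimal sketch of how test responses could be broken out by group once they are exported to a file. The file name and column names (`ad_variant`, `gender`, `appeal_score`) are assumptions for illustration, not the output of any particular survey tool; adjust them to match your own export.

```python
# Minimal sketch: segmenting ad-test responses by audience group.
# Assumed columns: respondent_id, ad_variant, gender, appeal_score (1-5).
import pandas as pd

responses = pd.read_csv("ad_test_responses.csv")

# Average appeal of each ad variant, broken out by segment.
segment_view = (
    responses
    .groupby(["ad_variant", "gender"])["appeal_score"]
    .agg(["mean", "count"])
    .round(2)
)
print(segment_view)

# Overall winner across all respondents, for comparison.
print(responses.groupby("ad_variant")["appeal_score"].mean().idxmax())
```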
How to run an ad test
1. Decide how and what to test
- Use Google Ads’ Experiments.
Why: This organized approach to testing creates experiment and control groups that you can use to quickly monitor results and implement changes.
Get started: Set up an experiment to test changes to an existing campaign (a rough sketch of an experiment plan follows this list).
- Focus experiments on high-value levers, such as bid strategies or ad extensions.
Why: You can only test so much. Avoid wasting your resources finding low-value outcomes.
- Use other methods to test things that campaign experiments don’t cover.
Why: Experiments aren’t an option for everything that’s worth testing, such as non-last click attribution and certain automated strategies.
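The structure below is not the Google Ads API or the Experiments interface; it is simply a hypothetical way to write down an experiment plan before you build it, so that the single variable being tested, the traffic split, and the success metric are committed to up front. All field names and example values are illustrative assumptions.

```python
# Hypothetical experiment plan, written down before building the test in
# Google Ads Experiments. Not an API call; the fields force you to commit
# to one variable, one success metric, and a fixed traffic split.
from dataclasses import dataclass

@dataclass
class ExperimentPlan:
    base_campaign: str          # existing campaign the experiment is based on
    variable_tested: str        # the single change, e.g. a new bid strategy
    control_value: str
    trial_value: str
    traffic_split_pct: int      # share of traffic sent to the trial arm
    primary_metric: str         # the one metric that decides the winner
    planned_duration_days: int

plan = ExperimentPlan(
    base_campaign="Search - Brand",
    variable_tested="bid strategy",
    control_value="Manual CPC",
    trial_value="Target CPA",
    traffic_split_pct=50,
    primary_metric="conversions",
    planned_duration_days=28,
)
print(plan)
```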
2. Create experiments that produce clear results
- Focus your tests on one variable at a time.
Why: It’s impossible to isolate the effect of any single change if an experiment updates multiple elements.
- Design tests to reach statistical significance as quickly as possible.
Why: The faster your tests reach significance, the faster you can make updates to your campaigns and improve performance (a sample significance check follows this list).
- Pick one metric to gauge the success of your tests.
Why: Balancing multiple metrics makes it difficult to pick a winner.
- Avoid changing campaigns while experiments are running.
Why: Mid-experiment changes can skew your results.
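For example, if the single success metric you picked is conversion rate, a two-proportion z-test is one common way to check whether the difference between the control and experiment arms is statistically significant. The click and conversion counts below are placeholders; substitute the figures from your own experiment report.

```python
# Two-proportion z-test on conversion rate (control vs. experiment arm).
# Placeholder numbers; replace with the values from your experiment report.
from math import sqrt
from statistics import NormalDist

control_clicks, control_conv = 4800, 240   # ~5.0% conversion rate
trial_clicks, trial_conv = 4750, 285       # ~6.0% conversion rate

p1 = control_conv / control_clicks
p2 = trial_conv / trial_clicks
p_pool = (control_conv + trial_conv) / (control_clicks + trial_clicks)
se = sqrt(p_pool * (1 - p_pool) * (1 / control_clicks + 1 / trial_clicks))

z = (p2 - p1) / se
p_value = 2 * (1 - NormalDist().cdf(abs(z)))  # two-sided test

print(f"z = {z:.2f}, p = {p_value:.4f}")
if p_value < 0.05:
    print("Difference is statistically significant at the 95% level.")
else:
    print("Not yet significant; keep the experiment running.")
```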
3. Analyze results and choose experiment winners
- Wait for enough data to be confident in your results.
Why: Only when you are confident in an experiment's outcome can you expect the campaign to keep performing that way once the change is rolled out.
Get started: Check for outliers within your top-level experiment outcomes (a sample outlier check follows these steps).
Why: High-volume ad groups or keywords can skew results for an entire campaign.
- Implement what you’ve learned in your future campaigns.
Why: The most critical step of any experiment is updating your tactics based on what you’ve learned.
- Keep records of your experiments.
Why: Well-documented results allow you to return for insights long after any experiments have ended.
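One simple way to run the outlier check mentioned above is to look at how much of the campaign's volume and conversions each ad group contributes, since a single dominant ad group can drive the top-level result on its own. The file name, column names, and the 30% threshold below are illustrative assumptions, not a prescribed rule.

```python
# Sketch of an outlier check on an exported ad group report.
# Assumed columns: ad_group, clicks, conversions; adjust to your export.
import pandas as pd

report = pd.read_csv("experiment_ad_group_report.csv")

report["click_share"] = report["clicks"] / report["clicks"].sum()
report["conv_share"] = report["conversions"] / report["conversions"].sum()

# Flag ad groups that contribute an outsized share of total conversions,
# since they can swing the campaign-level outcome by themselves.
outliers = report[report["conv_share"] > 0.30]
print(outliers.sort_values("conv_share", ascending=False))
```

If one ad group dominates, consider reading the experiment results both with and without it before declaring a winner.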