TEENMARK TECHNICAL GUIDE



2020 TEENMARK

TECHNICAL APPENDIX

MRI

200 Liberty St

New York, NY 10281

The MRI TEEN Study

A. SAMPLE DESIGN

The 2020 MRI TEEN Study comprises data collected from a random sample of respondents, 12-19 years of age, residing in the 48 contiguous United States.

The source of the sample frame for the TEEN Study is MRI’s Doublebase 2020 households. While conducting the Doublebase study, which used a strict area probability sample, MRI collected sex and age information about every person residing in each household. From March 1, 2018 to June 2, 2019 (for Waves 79-80) and from March 1, 2019 to May 31, 2020 (for Waves 81-82), MRI attempted to survey every teenager in those households, except those who were themselves 2020 Doublebase respondents.

Because of the lag between the initial MRI interview for the adult syndicated study (the 2020 Doublebase) and the TEENMARK survey, MRI considered the amount of time elapsed between the initial collection of household information and the execution of the TEENMARK study when determining respondent eligibility. For example, children who were 11 years of age at the time of the initial screening a year earlier had since become eligible respondents, while 19-year-olds at the initial screening were no longer eligible for the Teen Study. To ensure that only teenagers were surveyed, the sex and age of the designated teen respondents were matched to the information originally collected. Any respondents under 12 or over 19 years of age were eliminated from the study.
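As an illustration of this eligibility check (a minimal sketch only; the field names and the aging tolerance are assumptions for illustration, not MRI's production rules), designated respondents can be filtered as follows:

    from dataclasses import dataclass

    @dataclass
    class DesignatedTeen:
        # Hypothetical record pairing the original household screening data
        # with the information reported on the returned TEENMARK questionnaire.
        person_id: str
        screening_sex: str
        screening_age: int   # age recorded at the initial Doublebase screening
        reported_sex: str
        reported_age: int    # age reported on the TEENMARK questionnaire

    def is_eligible(teen: DesignatedTeen, max_aging_years: int = 2) -> bool:
        """Keep only respondents whose sex matches the screening record, whose
        age is consistent with aging since the screening (tolerance assumed),
        and who are 12-19 years of age at the time of the TEENMARK survey."""
        consistent = (
            teen.reported_sex == teen.screening_sex
            and teen.screening_age <= teen.reported_age
            <= teen.screening_age + max_aging_years
        )
        return consistent and 12 <= teen.reported_age <= 19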

B. THE SURVEY QUESTIONNAIRE

Data for the study were collected primarily by mailed questionnaire, supplemented by some in-person tests in which the interviewer personally delivered the TEENMARK survey to eligible respondents within the household. The survey instrument contained questions pertaining to media exposure for magazines, radio, television and on-line services. Demographic and product usage data were also collected. Each questionnaire "packet" contained a $10.00 incentive for completing and returning the survey. In addition to the incentive, MRI held a random drawing for an Amazon gift card.

A second questionnaire was sent to all teenagers who had not responded to the first mailing within 30 days. No additional incentive was included in this mailing.

The completion rate for the study is shown below:

|Originally mailed/placed |16,315 | |
|Undeliverable |705 | |
|Total eligible |15,610 |100% |
|Final in-tab |1,955 |12.52% |
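The in-tab figures above follow from two arithmetic steps, shown below as a small illustrative calculation (Python): the eligible base excludes undeliverable packets, and the completion rate is the in-tab count as a share of that base.

    mailed_or_placed = 16_315
    undeliverable = 705
    final_in_tab = 1_955

    total_eligible = mailed_or_placed - undeliverable    # 15,610
    completion_rate = final_in_tab / total_eligible      # 0.1252...

    print(f"Total eligible: {total_eligible:,}")          # Total eligible: 15,610
    print(f"Completion rate: {completion_rate:.2%}")      # Completion rate: 12.52%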

C. DATA COLLECTION AND QUESTIONNAIRE SCOPE

Ideally, the TEENMARK methodology would be identical to that of the adult study. The Teen Study, however, was conducted primarily by mail rather than by the adult study's combination of in-home personal interviews and self-administered questionnaires. In addition, TEENMARK respondents were not, by design, the only persons surveyed in their households, whereas the previously surveyed adults were at the time of their interviews. Because teenagers represent insignificant proportions of the markets for many products and of the audiences of many media (e.g., alcoholic beverages and television news programs), product category and media questions were appropriately abbreviated. Finally, some product categories and media whose markets and audiences are primarily teenage oriented were included in the TEENMARK research but not in the study of adult consumers.

1. MAGAZINE READERSHIP MEASUREMENT AND COMPUTATION

The procedure employed in the teen study was based on two straightforward questions. Each respondent was asked:

(a) About how many issues of an average four do you read or look into?; and

(b) Have you read or looked into any copy in the (last publication interval)? The publication interval conformed to the publication frequency of each listed magazine.

The list of magazines was rotated both alphabetically and by publication cycle so that four versions of the magazine questioning procedure were created.

The TEENMARK magazine readership questions and audience calculation were different from those employed in the adult study. The divergent approaches were dictated by the data collection methods used.

In the adult study, respondents are asked by interviewers to sort a deck of approximately 200 cards bearing magazine logos into three categories: read or looked into during the past six months (yes, sure have), might have done so (not sure), and did not read or look into during the past six months (no, sure have not). This "screen" or "filter" question reduces the number of magazines about which subsequent questions are asked. Following the sorting procedure, the surviving (read or might have read) magazines are presented again, and respondents are asked how many issues of the average four are read or looked into (frequency) and then whether each magazine was read or looked into during its most recent publication interval (recency): seven days for weekly magazines, 30 days for monthly magazines, and so on.

The TEENMARK research, because it was conducted by mail, did not use the screening procedure described above. By omitting the screen in a mailed survey, MRI abided by the generally held belief that such a procedure should be avoided. Michael Brown stated this admonition best for readership surveys: "One of these (readership surveys) involved a self-completion questionnaire and with this technique, an 'invitation' to skip certain subsequent questions, as implied by the filter, is usually avoided."

Instead, respondents were first asked to indicate the frequency of their readership of the average four issues, then to indicate whether they had read or looked into the magazine during its most recent publication interval.

Differences in readership results were expected and indeed were found in MRI's pretesting. These differences may be attributable to the following factors:

a. During the personal interview in the adult study, recency questions are asked with reference to a specific prior date or day, set back from the interview by the length of the publication interval. This was not possible with the mail survey and may have lessened the accuracy with which respondents were able to answer. In addition, TEENMARK respondents were readily able, while answering the frequency question, to preview the following question set (the recent reading questions) and, while responding to the recent reading questions, to refer back to their answers to the frequency question. This would have allowed respondents to attempt to achieve perceived consistency between these question sets, if they were motivated to do so.

b. In the adult study, after respondents "screen in" by indicating readership of a magazine during the past six months, they are asked whether their readership of the average four issues is 0, 1, 2, 3, or 4. In the Teen Study no screening question was asked, so a respondent reporting zero frequency cannot be distinguished from a non-reader.

Since a major objective of the TEENMARK research is to provide readership data which may be combined with readership data from the adult study, two measures were employed to determine the extent of the differences in magazine readership, and to ensure results would be reported compatibly:

First, the inclusion of 18 and 19 year olds in the TEENMARK sample provided a direct means to compare results of the two studies, since respondents aged 18 and 19 are also included in the adult study.

Second, MRI has years of historical data on the relationship between reading frequency and recency; the TEENMARK data were examined for consistency with these adult data relationships.

To make the TEENMARK magazine readership data compatible with the adult magazine readership data, the following procedure was undertaken. The frequency and recency questions were asked almost identically in both studies yet, due to the methodological differences described above, produced different results; specifically, the relationship between frequency and recency was different in the teen and adult studies. To render the TEENMARK readership data equivalent to the adult data, six years of each magazine's historical frequency/recency relationship were calculated from the adult study (the three most recent Doublebases). Within each frequency level (1, 2, 3, or 4 of the average four issues), the recency responses from the Teen Study were conformed to their historical relationships with frequency for each magazine. The data were conformed separately for each of three age groups (12-14, 15-17, and 18-19) to maintain response consistency within age cohorts.
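As a minimal sketch of the first part of this procedure (Python with pandas; the column names and data layout are assumptions for illustration, not MRI's production system), the historical recency-given-frequency rate can be tabulated from weighted adult responses for each magazine and frequency level:

    import pandas as pd

    # Assumed input: one row per adult response from the three most recent
    # Doublebases with columns ['magazine', 'freq', 'recent', 'weight'],
    # where freq is the claimed 1-4 issues of an average four and recent
    # indicates reading within the most recent publication interval.

    def historical_recency_rates(adult: pd.DataFrame) -> pd.DataFrame:
        """Weighted share of recent readers within each magazine/frequency
        cell: the frequency/recency relationship to which the teen recency
        responses are conformed."""
        adult = adult.assign(recent_weight=adult["weight"] * adult["recent"])
        cells = adult.groupby(["magazine", "freq"], as_index=False).agg(
            total_weight=("weight", "sum"),
            recent_weight=("recent_weight", "sum"),
        )
        cells["recency_rate"] = cells["recent_weight"] / cells["total_weight"]
        return cells[["magazine", "freq", "recency_rate"]]

In practice these rates would be computed (and applied) separately within the three age groups noted above.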

The recency data for magazines not measured in the adult study were conformed to an averaged, historical frequency-recency relationship in the Teen study.

MRI conformed the audiences by adding a systematic Nth selection of screeners (using weighted data) who did not claim readership in the recency question, whenever the read-to-screen ratios in the Teen Study were lower than those found in the National Adult study (the Survey of the American Consumer). If the Teen read-to-screen ratios were higher than in the National study, claimed readers were systematically removed (again using weighted data) with an every-Nth procedure to meet the targeted read-to-screen ratio found in the National Adult study. MRI employed this modeling procedure for Teen respondents in all interviewing waves prior to the second half of respondents for TEENMARK 2020.
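The every-Nth mechanics can be sketched roughly as below (Python; the record layout, helper name, and the rule for choosing N are simplifications of the procedure described, not MRI's exact implementation). Non-claiming screeners are added as readers, every Nth weighted record, until the weighted read-to-screen ratio approaches the National Adult target; removal in the opposite case is symmetric.

    def add_every_nth(claimed_weight, screen_weight, candidates, target_ratio):
        """Select every Nth non-claiming screener (record_id, weight) to flag
        as a recent reader so the weighted read-to-screen ratio approaches
        target_ratio. Returns the record ids to add."""
        shortfall = target_ratio * screen_weight - claimed_weight
        if shortfall <= 0 or not candidates:
            return []                         # already at or above the target
        total_candidate_weight = sum(w for _, w in candidates)
        # Skip interval chosen so the weight added roughly equals the shortfall.
        n = max(1, round(total_candidate_weight / shortfall))
        added, added_weight = [], 0.0
        for i, (record_id, weight) in enumerate(candidates):
            if i % n == 0 and added_weight < shortfall:
                added.append(record_id)
                added_weight += weight
        return added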

Beginning with the second half of respondents for TEENMARK 2020, MRI began applying the read-to-screen probabilities found in the National Adult study directly to each magazine’s weighted frequency estimates. This change was made to be completely consistent with the new probability approach employed in the National study; as a result, claimed readership is no longer used in the modeling.
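A hedged sketch of the newer computation for a single magazine follows (Python; the data shapes and names are assumptions): the National Adult read-to-screen probability at each frequency level is applied directly to the teen weighted frequency base, with no reference to claimed recent reading.

    def estimated_teen_audience(weighted_freq_base, adult_read_probability):
        """Apply National Adult read-to-screen probabilities to the teen
        weighted frequency estimates for one magazine.

        weighted_freq_base:     dict mapping frequency level (1-4) to the
                                weighted number of teens claiming that many
                                of an average four issues.
        adult_read_probability: dict mapping the same levels to the probability
                                of reading an average issue, taken from the
                                National Adult study."""
        return sum(weighted_freq_base[level] * adult_read_probability[level]
                   for level in weighted_freq_base)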

The resultant adjusted data correlate closely with the readership by 18 and 19 year olds of magazines also measured in the adult research. This indicates that restoring the historical frequency-recency relationship largely overcomes the differential results that arose from the use of separate data collection techniques.

2. PRINT WEBSITES AND DIGITAL READING

Respondents are asked if they have visited a magazine or newspaper website in the last 6 months, followed by questions about which specific devices (laptop or desktop computer, e-reader (brand(s) specified), tablet (brand(s) specified), cell phone, smartphone or other mobile device), if any, have been used to read or look into a magazine and/or a newspaper in the last 6 months.

3. COMIC READING

For each listed comic title, respondents are asked the number of issues of an average four they read or looked into, followed by a question about whether they have read or looked into any copy in the last 30 days.

4. CELL PHONES & SMARTPHONES

• Cell or Smartphone ownership

• Specific brand(s) of cell or smartphone owned

• Activities you have done using a cell or smartphone, last 30 days

• Types of Apps you personally used last 30 days

• Apps/Activities you would miss if unavailable

• Number of Apps downloaded for free, last 30 days

• Number of Apps purchased, last 30 days

• Amount spent on Apps, last 30 days

• Types of communication features used last 30 days

• Types of media features used last 30 days

• Types of other features used last 30 days

• Who chose your cell or smartphone service?

• Who chose your cell or smartphone?

• Who pays your cell or smartphone bill?

5. E-READERS AND TABLETS

• Specific brand(s) of tablets and e-readers owned.

• Used specific brand(s) of tablets and e-readers (your own or someone else’s), last 30 days

• In the last 7 days, how much time, if any, did you spend using a tablet and/or e-reader?

• In a typical week during the school year, where do you use a tablet and/or e-reader?

• Activities you have done using a tablet or e-reader, last 30 days

• Types of Apps you personally used last 30 days

• Apps/Activities you would miss if unavailable

• Number of Apps downloaded for free, last 30 days

• Number of Apps purchased, last 30 days

• Amount spent on Apps, last 30 days

6. INTERNET AND ONLINE SERVICES

The questionnaire contained nine questions relating to Internet and online service usage. The questions were worded as follows:

• In the last 30 days, have you accessed the Internet (using a computer or any other device) in any of the following places?

• On average, how much time do you spend using the Internet on a typical weekday (i.e. Monday-Friday), not including time spent on email or IM?

• On average, how much time do you spend using the Internet on a typical weekend day (i.e. Saturday or Sunday), not including time spent on email or IM?

• In the last 30 days have you connected to the Internet using a computer (such as a desktop or laptop) with Wi-Fi or another wireless connection at a location outside of your home, such as a park or coffee shop?

• In the last 30 days, which websites or search engines, if any, did you use to find other websites or information?

• What type of messaging, chat and/or video chat services do you use?

• Which, if any, of the following activities did you do on the Internet in the last 30 days?

• Which, if any, of these social media, photo or video-sharing sites did you visit or use in the last 30 days?

• Do your parents let you go anywhere you want on the Internet?

7. WEBSITES AND APPS

• Which, if any, of the following websites or apps did you visit or use in the last 30 days?

• Email address used

8. RADIO/AUDIO LISTENING

The radio listening question for weekday listening asks:

"To the nearest hour, approximately how much time do you usually spend listening to or hearing a radio or other audio services on a typical weekday? Include all listening, whether in your home, car or any other place. Include listening on a radio, computer, tablet, cell phone or any other device. Do not include listening to your personal music collection, such as CDs or audio you have purchased or downloaded.”

The respondent is asked about five separate time periods. For each time period mentioned, the respondent is asked to write in the call letters of the stations usually listened to or heard and to indicate whether each station is AM, FM, Internet/App, or SiriusXM.

The same questioning sequence was applied to “a typical weekend day, Saturday/Sunday.”

The respondent was also asked to indicate where he/she listened to the radio or audio services most often on a typical weekday and a typical weekend day.

9. RADIO DISNEY

Respondents are asked if they have listened to Radio Disney in the last week (last 7 days).

10. MUSIC AND AUDIO SERVICES

• Music or audio services listened to or used, last 30 days

• Music or audio-related activities done using the Internet, last 30 days

• Have you listened to a podcast in the last 30 days?

11. SMART SPEAKERS

• Specific brand(s) of voice-activated smart speakers you or anyone in your household owns

• Activities personally done using a smart speaker, last 30 days

12. TELEVISION VIEWING

The questionnaire contained two sets of questions for television viewing. The first set asked about usual viewing on an "average weekday" followed by on an “average weekend day”. The question was worded as follows:

"Please write in approximately how much time you usually spend watching television on an average weekday, Monday through Friday, for each of the time periods shown."

Time periods varied by time zone in which the teenager resided. Similar questions were asked for Saturday/Sunday viewing, but the time periods were fewer than for the average weekday.

Respondents were also asked specific TV channels and networks with the following question:

"About how many hours have you watched each of the following services in the past 7 days?"

In addition to these questions, the survey measured teen viewing of specific programs. The recency and frequency of viewing, along with attention levels and place of viewing, were asked for daily (Monday through Friday) programs, weekly programs, weekend programs, seasonal sports, and annual sports events/specials. In the 2020 Study, respondents were asked the likelihood of watching the Olympics.

13. PREMIUM CHANNELS

Respondents were asked how many hours they watched specific premium channels in the past 7 days.

14. STREAMING VIDEO

Respondents were asked which, if any, specific streaming services they used in the last 30 days to watch TV shows, movies or other programming.

15. WATCHING TELEVISION

• Activities done when TV commercials come on

• Activities done while watching TV

• Who do you watch TV with?

• Favorite device to watch TV on

16. VIDEO-ON-DEMAND AND PAY-PER-VIEW VIEWING

• In the past 12 months, have you watched any programs, movies, or events on Pay-Per-View?

o Number of times watched types of programs, past 12 months on Pay-Per-View

• In the past 12 months, have you watched any programs, movies, or events on Video-On-Demand?

o Number of times watched types of programs, past 12 months on Video-On-Demand

17. SOCIAL MEDIA

Respondents were asked which activities they had done using a social media, photo or video-sharing service (such as Instagram, Snapchat or YouTube) in the last 30 days.

18. MORE ABOUT SOCIAL MEDIA

Respondents were asked which type(s) of videos they had watched using social media services such as YouTube or Facebook Watch.

19. VIDEO GAMES (Personally played)

Respondents were asked which specific systems, game types, accessories, and video game titles they had played in the last 30 days, and how many hours they had played in the last 7 days.

Respondents were also asked the following:

• Forms of video games played

• Online Gaming Services used last 30 days (e.g. Nintendo Network, Nintendo Switch Online, etc.)

• Types of content downloaded or streamed through services listed

• Other activities done through services listed

• Did you watch or play in an eSports event (video game tournament), last 30 days?

• In the last 30 days, have you played a multiplayer game online?

• In the last 30 days, have you played a MMO (Massive Multi-player) game online?

• Total amount spent, last 12 months, on video games and on video game systems (hardware)

• Number of video games, purchased in the last 12 months and rented in the last 30 days

• Where video games were purchased

• How do you find out about new video games?

• Which if any, of the following things do you do after you play a new video game?

• Who do you typically play video games with?

• Which, if any, of these activities have you done in the last 30 days?

20. PSYCHOGRAPHICS

The questionnaire contained twenty-four categories relating to the teenager’s interests and attitudes. The topics that were released this year are as follows:

• Ads

• Alternative Advertising

• Beauty: Hair

• Beauty: Make-up

• Brand Loyalty

• Cellular/Mobile

• Fashion & Health

• Finance

• Food

• Future Goals

• Internet/Online

• Interest in Sports

• TV Watching

• Social Media

• Media

• Memberships/Clubs

• Music

• Personalities

• Purchasing Influences

• Stresses

• Socializing

• Technology

• Volunteerism

• Yourself

21. DEMOGRAPHIC AND RELATED DATA

A limited number of demographic questions were asked of the teenagers. In addition, a number of household-level responses provided by the adult respondent in the same household were carried over to the teen survey.

22. PRODUCT DATA

The collection of information on product use and consumption followed the demographic and media sections of the questionnaire. To be consistent with the adult study, questions common to the adult and teen study were asked in exactly the same format, including timeframe. Additional questions, particularly relevant to the teen market, were added to the product sections. As a result, not all of the product data can be merged with the adult study.

D. DATA PROCESSING

All data processing procedures are consistent with those employed by MRI for the adult study with one notable exception: because all data are collected by a single questionnaire, there is no data ascription. For a detailed description of other data processing procedures, please refer to the MRI Technical Guide for the Adult Study.

E. SAMPLE BALANCING/PROJECTION

The TEENMARK study is weighted and sample balanced to conform to estimates provided by the U.S. Bureau of the Census and Claritas.

The first stage, weighting, accounts for the differential probabilities of household selection for the study. The factors affecting these probability rates and the applied weights are discussed in detail in the Technical Guide for the Adult Study.
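As a minimal illustration of this first stage (not MRI's production weighting; the selection-probability factors themselves are documented in the adult Technical Guide), each responding household's base weight is the reciprocal of its probability of selection:

    def base_weight(selection_probability: float) -> float:
        """First-stage design weight: households selected with lower
        probability receive proportionally larger weights."""
        if not 0.0 < selection_probability <= 1.0:
            raise ValueError("selection probability must be in (0, 1]")
        return 1.0 / selection_probability

    # Example: a household selected with probability 1/2,500 carries a
    # base weight of 2,500 before sample balancing.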

After weighting, the sample is balanced within sex on the following population parameters:

(1) Top Ten Markets

(2) Age

(3) Race

(4) Metropolitan and Micropolitan/Unassigned CBSA within Census Region

(5) Household Income

(6) Education

(7) County Size

(8) Hispanic Origin

The sample balancing procedure is the widely used technique first discussed by W. Edwards Deming in his book "Statistical Adjustment of Data". After sample balancing, the distribution of weights is inspected and the "outlier" weights are trimmed to reduce the sampling tolerance levels.
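The sketch below (Python with NumPy; simplified inputs and hypothetical function names) illustrates this kind of iterative proportional adjustment to marginal population targets, followed by a simple trimming pass of the sort described. It is an illustration of the general technique, applied separately within sex as noted above, not MRI's exact procedure.

    import numpy as np

    def sample_balance(weights, memberships, targets, iterations=50):
        """Iteratively adjust weights so weighted category totals match the
        population targets on every balancing dimension (Deming's method).

        weights:     initial design weights, shape (n_respondents,)
        memberships: one integer array per dimension (e.g. age, race, county
                     size) giving each respondent's category on that dimension
        targets:     one array per dimension of population totals, aligned
                     with that dimension's categories
        """
        w = np.asarray(weights, dtype=float).copy()
        for _ in range(iterations):
            for categories, target in zip(memberships, targets):
                current = np.bincount(categories, weights=w, minlength=len(target))
                safe = np.where(current > 0, current, 1.0)
                factors = np.where(current > 0, target / safe, 1.0)
                w *= factors[categories]
        return w

    def trim_outlier_weights(w, lower_pct=1, upper_pct=99):
        """Clip extreme ('outlier') weights to chosen percentiles to limit
        their effect on sampling tolerances."""
        lo, hi = np.percentile(w, [lower_pct, upper_pct])
        return np.clip(w, lo, hi)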

The table below compares the before and after sample balancing estimates.

| |MALE TEENS (%) | |FEMALE TEENS (%) | |
| |BEFORE |AFTER |BEFORE |AFTER |
|Total |100.0 |100.0 |100.0 |100.0 |
|Age | | | | |
|12-14 |43.4 |37.6 |45.4 |37.2 |
|15-17 |41.7 |39.1 |39.1 |38.9 |
|18-19 |14.9 |23.4 |15.5 |23.9 |
|Race | | | | |
|White Only |60.7 |63.9 |63.7 |63.7 |
|Black/African-American Only |12.5 |15.3 |12.2 |15.4 |
|Other Race/Multiple Classifications |26.8 |20.8 |24.1 |20.9 |
|CBSA | | | | |
|NE Metropolitan CBSA |16.3 |15.8 |20.8 |16.0 |
|NE Micropolitan CBSA/Unassigned |0.4 |0.7 |0.3 |0.6 |
|NC Metropolitan CBSA |15.6 |16.7 |16.6 |16.7 |
|NC Micropolitan CBSA/Unassigned |3.6 |4.7 |2.4 |4.6 |
|S Metropolitan CBSA |35.5 |34.0 |33.1 |32.7 |
|S Micropolitan CBSA/Unassigned |2.0 |4.6 |3.7 |6.0 |
|W Metropolitan CBSA |26.1 |22.4 |22.0 |21.6 |
|W Micropolitan CBSA/Unassigned |0.5 |1.1 |1.2 |1.9 |
|HHI | | | | |
|Less than 20K |5.6 |10.2 |6.4 |8.7 |
|20-29K |3.7 |5.3 |6.2 |6.4 |
|30-39K |4.7 |6.0 |5.6 |8.1 |
|40-49K |8.5 |7.8 |8.5 |7.1 |
|50-74K |17.5 |16.7 |13.1 |16.0 |
|75-99K |14.7 |13.5 |12.2 |13.8 |
|100K+ |45.3 |40.6 |48.0 |40.0 |
|Education | | | | |
|In School |95.3 |91.7 |95.2 |91.0 |
|Not In School |4.7 |8.3 |4.8 |9.0 |
|County Size | | | | |
|A |58.2 |41.2 |59.3 |41.3 |
|B |20.2 |30.5 |23.6 |30.6 |
|C/D |21.6 |28.3 |17.1 |28.1 |

F. SAMPLING TOLERANCES

All sample surveys are characterized by sampling tolerances. Sampling tolerance is the difference that can be expected between the results of a sample survey and the results of a full survey or census, using the same procedures and techniques. This is the difference due to the chance selection of one group of respondents or another. For a more extensive discussion of the differences in sampling tolerance levels, please refer to the Technical Guide for the Adult Study.

The calculated sampling tolerances for the media audiences are tabulated at the beginning of the audience volume. These are computed using a set of eight replicated subsamples of the total sample. The differences among the eight subsamples, which are chance differences, are used to estimate the sampling tolerances of the total sample.
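The following sketch (Python with NumPy) illustrates one common way such replicate-based tolerances are computed; the exact estimator MRI uses is described in the adult Technical Guide, so treat this as illustrative only.

    import numpy as np

    def replicate_tolerance(replicate_estimates, z=1.96):
        """Estimate the sampling tolerance of an audience figure from the
        spread of estimates computed in replicated subsamples.

        replicate_estimates: the audience estimate calculated independently
                             within each of the eight subsamples.
        Returns (standard_error, tolerance), where tolerance is the half-width
        of an approximate 95% interval around the full-sample estimate."""
        r = np.asarray(replicate_estimates, dtype=float)
        standard_error = r.std(ddof=1) / np.sqrt(len(r))
        return standard_error, z * standard_error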

Sampling tolerances should be used to evaluate the precision of an estimate and the degree of confidence that can be placed in it.
