
High-Impact Articles

Journal of Survey Statistics and Methodology, sponsored by the American Association for Public Opinion Research and the American Statistical Association, began publishing in 2013. Its objective is to publish cutting-edge scholarly articles on statistical and methodological issues for sample surveys, censuses, administrative record systems, and other related data.

OUP has granted free access to the articles on this page, which represent some of the most cited, most read, and most discussed articles from recent years. These articles are just a sample of the impressive body of research from Journal of Survey Statistics and Methodology.


Articles on Surveys

Displaying 1–20 of 114 articles.

The number of religious ‘nones’ has soared, but not the number of atheists – and as social scientists, we wanted to know why

Christopher P. Scheitle, West Virginia University, and Katie Corcoran, West Virginia University

Gen Zers and millennials are still big fans of books – even if they don’t call themselves ‘readers’

Kathi Inman Berens, Portland State University, and Rachel Noorda, Portland State University

Honey bees are surprisingly abundant, research shows – but most are wild, not managed in hives

Francis Ratnieks, University of Sussex, and Oliver Visick, University of Sussex

Gen Z and millennials have an unlikely love affair with their local libraries

US food insecurity surveys aren’t getting accurate data regarding Latino families

Cassandra M. Johnson, Texas State University; Amanda C. McClain, San Diego State University; and Katherine Dickin, Cornell University

What are young Australians most worried about? Finding affordable housing, they told us

Lucas Walsh, Monash University; Blake Cutler, Monash University; Thuc Bao Huynh, Monash University; and Zihong Deng, Monash University

Potentially faulty data spotted in surveys of drug use and other behaviors among LGBQ youth

Joseph Cimpian, New York University

Most Americans support NASA – but don’t think it should prioritize sending people to space

Mariel Borowitz, Georgia Institute of Technology, and Teasel Muir-Harmony, Georgetown University

Americans remain hopeful about democracy despite fears of its demise – and are acting on that hope

Ray Block Jr, Penn State; Andrene Wright, Penn State; and Mia Angelica Powell, Penn State

US birth rates are at record lows – even though the number of kids most Americans say they want has held steady

Sarah Hayford, The Ohio State University, and Karen Benjamin Guzzo, University of North Carolina at Chapel Hill

LGBTQ Americans are 9 times more likely to be victimized by a hate crime

Andrew Ryan Flores, American University; Ilan Meyer, University of California, Los Angeles; and Rebecca Stotzer, University of Hawaii

Mussels are disappearing from the Thames and growing smaller – and it’s partly because the river is cleaner

Isobel Ollard, University of Cambridge

What psychology tells us about the failure of the emergency services at the Manchester Arena bombing

Nicola Power, Lancaster University

Who sees what you flush? Wastewater surveillance for public health is on the rise, but a new survey reveals many US adults are still unaware

Rochelle H. Holm, University of Louisville

Eating lots of meat is bad for the environment – but we don’t know enough about how consumption is changing

Kerry Smith, University of Reading, and Emma Garnett, University of Oxford

We asked Ukrainians living on the front lines what was an acceptable peace – here’s what they told us

Gerard Toal, Virginia Tech, and Karina Korostelina, George Mason University

More than 1 in 5 US adults don’t want children

Zachary P. Neal, Michigan State University, and Jennifer Watling Neal, Michigan State University

A window into the number of trans teens living in America

Jody L. Herman, University of California, Los Angeles; Andrew Ryan Flores, American University; and Kathryn K. O’Neill, University of California, Los Angeles

Are Australians socially inclusive? 5 things we learned after surveying 11,000 people for half a decade

Kun Zhao, Monash University, and Liam Smith, Monash University

Climate change, the environment and the cost of living top the #SetTheAgenda poll

Misha Ketchell, The Conversation



Statistics articles from across Nature Portfolio

Statistics is the application of mathematical concepts to understanding and analysing large collections of data. A central tenet of statistics is to describe the variations in a data set or population using probability distributions. This analysis aids understanding of what underlies these variations and enables predictions of future changes.
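As a minimal illustration of that idea (the data and the threshold below are invented for the example), the sketch fits a normal distribution to a small sample to describe its variation and then uses the fitted distribution to estimate the probability of a future observation exceeding a threshold:

```python
import numpy as np
from scipy import stats

# Hypothetical measurements of some quantity (invented for illustration).
data = np.array([4.8, 5.1, 5.3, 4.9, 5.6, 5.0, 5.2, 4.7, 5.4, 5.1])

# Describe the variation with a probability distribution:
# estimate the mean and standard deviation of a normal model.
mu, sigma = data.mean(), data.std(ddof=1)

# Use the fitted distribution to predict future behaviour:
# probability that a new observation exceeds 5.5.
p_exceed = 1 - stats.norm.cdf(5.5, loc=mu, scale=sigma)
print(f"mean={mu:.2f}, sd={sigma:.2f}, P(X > 5.5)={p_exceed:.2f}")
```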

Latest Research and Reviews

Grasshopper platform-assisted design optimization of Fujian rural earthen buildings considering low-carbon emissions reduction

The METRIC-framework for assessing data quality for trustworthy AI in medicine: a systematic review

  • Daniel Schwabe
  • Katinka Becker
  • Tobias Schaeffter

Effects of dietary fish to rapeseed oil ratio on steatosis symptoms in Atlantic salmon (Salmo salar L.) of different sizes

  • D. Siciliani
  • Å. Krogdahl

A model-free and distribution-free multi-omics integration approach for detecting novel lung adenocarcinoma genes

  • Shaofei Zhao

Intrinsic dimension as a multi-scale summary statistics in network modeling

  • Iuri Macocco
  • Antonietta Mira
  • Alessandro Laio

A new possibilistic-based clustering method for probability density functions and its application to detecting abnormal elements

  • Hung Tran-Nam
  • Thao Nguyen-Trang
  • Ha Che-Ngoc


News and Comment

Efficient learning of many-body systems

The Hamiltonian describing a quantum many-body system can be learned using measurements in thermal equilibrium. Now, a learning algorithm applicable to many natural systems has been found that requires exponentially fewer measurements than existing methods.

Fudging the volcano-plot without dredging the data

Selecting omic biomarkers using both their effect size and their differential status significance (i.e., selecting the “volcano-plot outer spray”) has long been equally biologically relevant and statistically troublesome. However, recent proposals are paving the way to resolving this dilemma.

  • Thomas Burger

Disentangling truth from bias in naturally occurring data

A technique that leverages duplicate records in crowdsourcing data could help to mitigate the effects of biases in research and services that are dependent on government records.

  • Daniel T. O’Brien

Sciama’s argument on life in a random universe and distinguishing apples from oranges

Dennis Sciama has argued that the existence of life depends on many quantities—the fundamental constants—so in a random universe life should be highly unlikely. However, without full knowledge of these constants, his argument implies a universe that could appear to be ‘intelligently designed’.

  • Zhi-Wei Wang
  • Samuel L. Braunstein

A method for generating constrained surrogate power laws

A paper in Physical Review X presents a method for numerically generating data sequences that are as likely to be observed under a power law as a given observed dataset.

  • Zoe Budrikis

Connected climate tipping elements

Tipping elements are regions that are vulnerable to climate change and capable of sudden drastic changes. Now research establishes long-distance linkages between tipping elements, with the network analysis offering insights into their interactions on a global scale.

  • Valerie N. Livina


Views on America’s global role diverge widely by age and party

War in Ukraine: Wide partisan differences on U.S. responsibility and support

Asian Americans: A survey data snapshot

Asian Americans are the fastest-growing major racial or ethnic group in the country. Here’s how they describe their own identities, their views of the U.S. and their ancestral homelands, their political and religious affiliations, and more.

  • Learn more about: Chinese Americans | Filipino Americans | Indian Americans | Japanese Americans | Korean Americans | Vietnamese Americans

How Americans Get Local Political News

Experiences of U.S. adults who don’t have children

Latest Publications

How do states fill vacancies in the U.S. Senate? It depends on the state

In the event that a Senate seat becomes vacant, governors in 45 states have the power to appoint a temporary replacement.

Here’s how Asian Americans describe their own identities, their views of the U.S. and their ancestral homelands, their political and religious affiliations, and more.

Majority of Americans support more nuclear power in the country

Americans remain more likely to favor expanding solar power (78%) and wind power (72%) than nuclear power (56%).

A third of adults under age 35 say it is extremely or very important that the U.S. play an active role in world affairs.

What do Americans think about fewer people choosing to have children?

The share of U.S. adults younger than 50 without children who say they are unlikely to ever have children rose from 37% in 2018 to 47% in 2023.


Election 2024

How Latino voters view the 2024 presidential election

While Latino voters have favored Democratic candidates in presidential elections for many decades, the margin of support has varied.

10 facts about Republicans in the U.S.

Third-party and independent candidates for president often fall short of early polling numbers

Americans’ views of government’s role: Persistent divisions and areas of agreement

Cultural issues and the 2024 election


Features

Quiz: Test your polling knowledge

The hardships and dreams of Asian Americans living in poverty

What public K-12 teachers want Americans to know about teaching

How people in 24 countries think democracy can improve

International Affairs

72% of Americans say the U.S. used to be a good example of democracy, but isn’t anymore

A median of 40% of adults across 34 other countries surveyed in 2024 say U.S. democracy used to be a good example for other countries to follow.

Most People in 35 Countries Say China Has a Large Impact on Their National Economy

Large majorities in nearly all 35 nations surveyed say China has a great deal or a fair amount of influence on their country’s economic conditions.

In some countries, immigration accounted for all population growth between 2000 and 2020

In 14 countries and territories, immigration accounted for more than 100% of population growth during this period.

NATO Seen Favorably in Member States; Confidence in Zelenskyy Down in Europe, U.S.

NATO is seen more positively than not across 13 member states. And global confidence in Ukraine’s leader has become more mixed since last year.


Internet & Technology

How Americans navigate politics on TikTok, X, Facebook and Instagram

X stands out as a place people go to keep up with politics. Still, some users see political posts on Facebook, TikTok and Instagram, too.

How Americans Get News on TikTok, X, Facebook and Instagram

X is still more of a news destination than these other platforms, but the vast majority of users on all four see news-related content.

72% of U.S. high school teachers say cellphone distraction is a major problem in the classroom

Some 72% of high school teachers say that students being distracted by cellphones is a major problem in their classroom.


Race & Ethnicity

What the data says about immigrants in the U.S.

In 2022, roughly 10.6 million immigrants living in the U.S. were born in Mexico, making up 23% of all U.S. immigrants.

The State of the Asian American Middle Class

The share of Asian Americans in the U.S. middle class has held steady since 2010, while the share in the upper-income tier has grown.

An Early Look at Black Voters’ Views on Biden, Trump and Election 2024

Black voters are more confident in Biden than Trump when it comes to having the qualities needed to serve another term.

A Majority of Latinas Feel Pressure To Support Their Families or To Succeed at Work

Many juggle cultural expectations and gender roles from both Latin America and the U.S., like doing housework and succeeding at work.

Asian Americans, Charitable Giving and Remittances

Overall, 64% of Asian American adults say they gave to a U.S. charitable organization in the 12 months before the survey. One-in-five say they gave to a charity in their Asian ancestral homeland during that time. And 27% say they sent money to someone living there.


U.S. Surveys

Pew Research Center has deep roots in U.S. public opinion research. Launched as a project focused primarily on U.S. policy and politics in the early 1990s, the Center has grown over time to study a wide range of topics vital to explaining America to itself and to the world.


International Surveys

Pew Research Center regularly conducts public opinion surveys in countries outside the United States as part of its ongoing exploration of attitudes, values and behaviors around the globe.


Data Science

Pew Research Center’s Data Labs uses computational methods to complement and expand on the Center’s existing research agenda.


Demographic Research

Pew Research Center tracks social, demographic and economic trends, both domestically and internationally.


Our Experts

“A record 23 million Asian Americans trace their roots to more than 20 countries … and the U.S. Asian population is projected to reach 46 million by 2060.”


Neil G. Ruiz, Head of New Research Initiatives


Methods 101 Videos

Methods 101: Random Sampling

The first video in Pew Research Center’s Methods 101 series helps explain random sampling – a concept that lies at the heart of all probability-based survey research – and why it’s important.

Methods 101: Survey Question Wording

Methods 101: Mode Effects

Methods 101: What Are Nonprobability Surveys?


Signature Reports

Race and LGBTQ issues in K-12 schools

Representative democracy remains a popular ideal, but people around the world are critical of how it’s working

Americans’ dismal views of the nation’s politics

Measuring religion in China

Diverse cultures and shared experiences shape Asian American identities

Parenting in America today

Editor’s Pick

Who are you? The art and science of measuring identity

Electric vehicle charging infrastructure in the U.S.

8 in 10 Americans say religion is losing influence in public life

How Americans view weight-loss drugs and their potential impact on obesity in the U.S.

Most Americans continue to say their side in politics is losing more often than it is winning

Immigration & Migration

How Temporary Protected Status has expanded under the Biden administration

Key facts about Asian Americans living in poverty

Latinos’ views on the migrant situation at the U.S.-Mexico border

Migrant encounters at the U.S.-Mexico border hit a record high at the end of 2023

What we know about unauthorized immigrants living in the U.S.

Social Media

6 facts about Americans and TikTok

WhatsApp and Facebook dominate the social media landscape in middle-income nations

How teens and parents approach screen time

Majorities in most countries surveyed say social media is good for democracy

A declining share of adults, and few teens, support a U.S. TikTok ban


Understanding and Evaluating Survey Research

  • December 2015
  • Journal of the Advanced Practitioner in Oncology 6(2):168-171



The Scientist spoke with Maximilien Chaumon about his database showing how COVID-19 related lockdowns warped more than 2,800 people’s perception of time.


Survey Research | Definition, Examples & Methods

Published on August 20, 2019 by Shona McCombes. Revised on June 22, 2023.

Survey research means collecting information about a group of people by asking them questions and analyzing the results. To conduct an effective survey, follow these six steps:

  • Determine who will participate in the survey
  • Decide the type of survey (mail, online, or in-person)
  • Design the survey questions and layout
  • Distribute the survey
  • Analyze the responses
  • Write up the results

Surveys are a flexible method of data collection that can be used in many different types of research.

Table of contents

  • What are surveys used for?
  • Step 1: Define the population and sample
  • Step 2: Decide on the type of survey
  • Step 3: Design the survey questions
  • Step 4: Distribute the survey and collect responses
  • Step 5: Analyze the survey results
  • Step 6: Write up the survey results
  • Other interesting articles
  • Frequently asked questions about surveys

What are surveys used for?

Surveys are used as a method of gathering data in many different fields. They are a good choice when you want to find out about the characteristics, preferences, opinions, or beliefs of a group of people.

Common uses of survey research include:

  • Social research: investigating the experiences and characteristics of different social groups
  • Market research: finding out what customers think about products, services, and companies
  • Health research: collecting data from patients about symptoms and treatments
  • Politics: measuring public opinion about parties and policies
  • Psychology: researching personality traits, preferences and behaviours

Surveys can be used in both cross-sectional studies, where you collect data just once, and in longitudinal studies, where you survey the same sample several times over an extended period.


Step 1: Define the population and sample

Before you start conducting survey research, you should already have a clear research question that defines what you want to find out. Based on this question, you need to determine exactly who you will target to participate in the survey.

Populations

The target population is the specific group of people that you want to find out about. This group can be very broad or relatively narrow. For example:

  • The population of Brazil
  • US college students
  • Second-generation immigrants in the Netherlands
  • Customers of a specific company aged 18-24
  • British transgender women over the age of 50

Your survey should aim to produce results that can be generalized to the whole population. That means you need to carefully define exactly who you want to draw conclusions about.

Several common research biases can arise if your survey is not generalizable, particularly sampling bias and selection bias. The presence of these biases has serious repercussions for the validity of your results.

It’s rarely possible to survey the entire population of your research – it would be very difficult to get a response from every person in Brazil or every college student in the US. Instead, you will usually survey a sample from the population.

The sample size depends on how large the population is and how precise you need your results to be. You can use an online sample size calculator to work out how many responses you need.

There are many sampling methods that allow you to generalize to broad populations. In general, though, the sample should aim to be representative of the population as a whole. The larger and more representative your sample, the more valid your conclusions. Again, beware of various types of sampling bias as you design your sample, particularly self-selection bias, nonresponse bias, undercoverage bias, and survivorship bias.
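As a rough sketch of what an online sample size calculator typically does (assuming you are estimating a proportion at 95% confidence; the population size and margin of error below are invented for the example), the required sample size can be approximated with Cochran's formula plus a finite population correction:

```python
import math

def required_sample_size(population_size, margin_of_error=0.05,
                         confidence_z=1.96, proportion=0.5):
    """Approximate sample size for estimating a proportion.

    Uses Cochran's formula with a finite population correction;
    proportion=0.5 gives the most conservative (largest) estimate.
    """
    n0 = (confidence_z ** 2) * proportion * (1 - proportion) / margin_of_error ** 2
    # The finite population correction shrinks n when the population is small.
    return math.ceil(n0 / (1 + (n0 - 1) / population_size))

# Example: surveying a population of 20,000 people with a 5% margin of error.
print(required_sample_size(20_000))  # about 377 respondents
```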

Step 2: Decide on the type of survey

There are two main types of survey:

  • A questionnaire, where a list of questions is distributed by mail, online or in person, and respondents fill it out themselves.
  • An interview, where the researcher asks a set of questions by phone or in person and records the responses.

Which type you choose depends on the sample size and location, as well as the focus of the research.

Questionnaires

Sending out a paper survey by mail is a common method of gathering demographic information (for example, in a government census of the population).

  • You can easily access a large sample.
  • You have some control over who is included in the sample (e.g. residents of a specific region).
  • The response rate is often low, and at risk for biases like self-selection bias.

Online surveys are a popular choice for students doing dissertation research, due to the low cost and flexibility of this method. There are many online tools available for constructing surveys, such as SurveyMonkey and Google Forms.

  • You can quickly access a large sample without constraints on time or location.
  • The data is easy to process and analyze.
  • The anonymity and accessibility of online surveys mean you have less control over who responds, which can lead to biases like self-selection bias.

If your research focuses on a specific location, you can distribute a written questionnaire to be completed by respondents on the spot. For example, you could approach the customers of a shopping mall or ask all students to complete a questionnaire at the end of a class.

  • You can screen respondents to make sure only people in the target population are included in the sample.
  • You can collect time- and location-specific data (e.g. the opinions of a store’s weekday customers).
  • The sample size will be smaller, so this method is less suitable for collecting data on broad populations and is at risk for sampling bias.

Interviews

Oral interviews are a useful method for smaller sample sizes. They allow you to gather more in-depth information on people’s opinions and preferences. You can conduct interviews by phone or in person.

  • You have personal contact with respondents, so you know exactly who will be included in the sample in advance.
  • You can clarify questions and ask for follow-up information when necessary.
  • The lack of anonymity may cause respondents to answer less honestly, and there is more risk of researcher bias.

Like questionnaires, interviews can be used to collect quantitative data: the researcher records each response as a category or rating and statistically analyzes the results. But they are more commonly used to collect qualitative data: the interviewees’ full responses are transcribed and analyzed individually to gain a richer understanding of their opinions and feelings.

Step 3: Design the survey questions

Next, you need to decide which questions you will ask and how you will ask them. It’s important to consider:

  • The type of questions
  • The content of the questions
  • The phrasing of the questions
  • The ordering and layout of the survey

Open-ended vs closed-ended questions

There are two main forms of survey questions: open-ended and closed-ended. Many surveys use a combination of both.

Closed-ended questions give the respondent a predetermined set of answers to choose from. A closed-ended question can include:

  • A binary answer (e.g. yes/no or agree/disagree)
  • A scale (e.g. a Likert scale with five points ranging from strongly agree to strongly disagree)
  • A list of options with a single answer possible (e.g. age categories)
  • A list of options with multiple answers possible (e.g. leisure interests)

Closed-ended questions are best for quantitative research. They provide you with numerical data that can be statistically analyzed to find patterns, trends, and correlations.

Open-ended questions are best for qualitative research. This type of question has no predetermined answers to choose from. Instead, the respondent answers in their own words.

Open questions are most common in interviews, but you can also use them in questionnaires. They are often useful as follow-up questions to ask for more detailed explanations of responses to the closed questions.

The content of the survey questions

To ensure the validity and reliability of your results, you need to carefully consider each question in the survey. All questions should be narrowly focused with enough context for the respondent to answer accurately. Avoid questions that are not directly relevant to the survey’s purpose.

When constructing closed-ended questions, ensure that the options cover all possibilities. If you include a list of options that isn’t exhaustive, you can add an “other” field.

Phrasing the survey questions

In terms of language, the survey questions should be as clear and precise as possible. Tailor the questions to your target population, keeping in mind their level of knowledge of the topic. Avoid jargon or industry-specific terminology.

Survey questions are at risk for biases like social desirability bias, the Hawthorne effect, or demand characteristics. It’s critical to use language that respondents will easily understand, and avoid words with vague or ambiguous meanings. Make sure your questions are phrased neutrally, with no indication that you’d prefer a particular answer or emotion.

Ordering the survey questions

The questions should be arranged in a logical order. Start with easy, non-sensitive, closed-ended questions that will encourage the respondent to continue.

If the survey covers several different topics or themes, group together related questions. You can divide a questionnaire into sections to help respondents understand what is being asked in each part.

If a question refers back to or depends on the answer to a previous question, they should be placed directly next to one another.

Step 4: Distribute the survey and collect responses

Before you start, create a clear plan for where, when, how, and with whom you will conduct the survey. Determine in advance how many responses you require and how you will gain access to the sample.

When you are satisfied that you have created a strong research design suitable for answering your research questions, you can conduct the survey through your method of choice – by mail, online, or in person.

Step 5: Analyze the survey results

There are many methods of analyzing the results of your survey. First you have to process the data, usually with the help of a computer program to sort all the responses. You should also clean the data by removing incomplete or incorrectly completed responses.

If you asked open-ended questions, you will have to code the responses by assigning labels to each response and organizing them into categories or themes. You can also use more qualitative methods, such as thematic analysis, which is especially suitable for analyzing interviews.

Statistical analysis is usually conducted using programs like SPSS or Stata. The same set of survey data can be subject to many analyses.
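The text above names SPSS and Stata; as a minimal sketch of the same workflow in Python (with a tiny invented dataset and made-up column names), processing, cleaning, coding, and testing might look like this:

```python
import pandas as pd
from scipy import stats

# Hypothetical survey responses, one row per respondent (invented data).
responses = pd.DataFrame({
    "age_group":       ["18-24", "25-34", "18-24", "35-44", "25-34", None],
    "satisfaction":    [4, 5, 3, 4, 2, 5],                 # 1-5 Likert item
    "would_recommend": ["yes", "yes", "no", "yes", "no", "yes"],
})

# Clean the data: drop incomplete responses.
clean = responses.dropna().copy()

# Code a closed-ended question as a numeric variable.
clean["recommend_coded"] = (clean["would_recommend"] == "yes").astype(int)

# Summarize, then test whether recommenders report higher satisfaction.
print(clean.groupby("would_recommend")["satisfaction"].mean())
t, p = stats.ttest_ind(
    clean.loc[clean["recommend_coded"] == 1, "satisfaction"],
    clean.loc[clean["recommend_coded"] == 0, "satisfaction"],
)
print(f"t = {t:.2f}, p = {p:.3f}")
```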

Step 6: Write up the survey results

Finally, when you have collected and analyzed all the necessary data, you will write it up as part of your thesis, dissertation, or research paper.

In the methodology section, you describe exactly how you conducted the survey. You should explain the types of questions you used, the sampling method, when and where the survey took place, and the response rate. You can include the full questionnaire as an appendix and refer to it in the text if relevant.

Then introduce the analysis by describing how you prepared the data and the statistical methods you used to analyze it. In the results section, you summarize the key results from your analysis.

In the discussion and conclusion, you give your explanations and interpretations of these results, answer your research question, and reflect on the implications and limitations of the research.

Other interesting articles

If you want to know more about statistics, methodology, or research bias, make sure to check out some of our other articles with explanations and examples.

  • Student’s t-distribution
  • Normal distribution
  • Null and Alternative Hypotheses
  • Chi square tests
  • Confidence interval
  • Quartiles & Quantiles
  • Cluster sampling
  • Stratified sampling
  • Data cleansing
  • Reproducibility vs Replicability
  • Peer review
  • Prospective cohort study

Research bias

  • Implicit bias
  • Cognitive bias
  • Placebo effect
  • Hawthorne effect
  • Hindsight bias
  • Affect heuristic
  • Social desirability bias

Frequently asked questions about surveys

A questionnaire is a data collection tool or instrument, while a survey is an overarching research method that involves collecting and analyzing data from people using questionnaires.

A Likert scale is a rating scale that quantitatively assesses opinions, attitudes, or behaviors. It is made up of 4 or more questions that measure a single attitude or trait when response scores are combined.

To use a Likert scale in a survey, you present participants with Likert-type questions or statements, and a continuum of items, usually with 5 or 7 possible responses, to capture their degree of agreement.

Individual Likert-type questions are generally considered ordinal data, because the items have clear rank order, but don’t have an even distribution.

Overall Likert scale scores are sometimes treated as interval data. These scores are considered to have directionality and even spacing between them.

The type of data determines what statistical tests you should use to analyze your data.
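As a small illustration of that last point (the item responses and group labels below are invented), a single Likert-type item can be compared across groups with a rank-based test, while a summed scale score is often treated as interval and analyzed with a parametric test:

```python
import pandas as pd
from scipy import stats

# Hypothetical responses to four Likert-type items measuring one attitude,
# each coded 1 (strongly disagree) to 5 (strongly agree).
items = pd.DataFrame({
    "item1": [4, 2, 5, 3, 4, 1],
    "item2": [5, 2, 4, 3, 4, 2],
    "item3": [4, 1, 5, 2, 5, 2],
    "item4": [3, 2, 4, 3, 4, 1],
})
group = pd.Series(["A", "B", "A", "B", "A", "B"])  # two invented groups

# A single item is ordinal: compare groups with a rank-based test.
u, p_item = stats.mannwhitneyu(items.loc[group == "A", "item1"],
                               items.loc[group == "B", "item1"])

# The summed scale score is often treated as interval: a t-test may be used.
scale_score = items.sum(axis=1)
t, p_scale = stats.ttest_ind(scale_score[group == "A"], scale_score[group == "B"])

print(f"Mann-Whitney U on one item: p = {p_item:.3f}")
print(f"t-test on the combined scale score: p = {p_scale:.3f}")
```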

The priorities of a research design can vary depending on the field, but you usually have to specify:

  • Your research questions and/or hypotheses
  • Your overall approach (e.g., qualitative or quantitative)
  • The type of design you’re using (e.g., a survey, experiment, or case study)
  • Your sampling methods or criteria for selecting subjects
  • Your data collection methods (e.g., questionnaires, observations)
  • Your data collection procedures (e.g., operationalization, timing and data management)
  • Your data analysis methods (e.g., statistical tests or thematic analysis)

Cite this Scribbr article

If you want to cite this source, you can copy and paste the citation or click the “Cite this Scribbr article” button to automatically add the citation to our free Citation Generator.

McCombes, S. (2023, June 22). Survey Research | Definition, Examples & Methods. Scribbr. Retrieved August 7, 2024, from https://www.scribbr.com/methodology/survey-research/

Other students also liked

Qualitative vs. Quantitative Research | Differences, Examples & Methods

Questionnaire Design | Methods, Question Types & Examples

What Is a Likert Scale? | Guide & Examples

Are you considering the survey participant’s experience?

Testing surveys: The participant experience

After a hiatus from participating in online panel surveys, researcher Ben Tolchinsky revisited the practice to assess changes in the survey experience since the mid-2000s. Joining 16 panels and taking hundreds of surveys on various devices, the author conducted a qualitative assessment to determine if the experience had improved. This article details the findings from Tolchinsky’s survey participation.

Evaluating the survey participant experience

Editor’s note: Ben Tolchinsky is owner of CBT Insights, Atlanta. 

When was the last time you participated in an online panel survey?

For me, it had been a while. Early in my career, in the mid-2000s, I joined a number of online panels so that I could stay “on the pulse” of online surveys and the industry. I developed three beliefs from this experience:

  • Surveys are too long.
  • Surveys are boring.
  • Surveys are unrewarding.

I’ve maintained these beliefs over the years, not having encountered anything that would change them. However, despite my beliefs and best efforts, I’ve contributed my share of lengthy and boring surveys. Since that early experiment, I occasionally participate in surveys, mostly from companies of which I’m a customer.

Because of my beliefs, I’ve often wondered and worried about the health of the lifeblood of our industry: survey participation. Despite my concerns, my sample needs have always been met.

Early in 2024, I had an itch to evaluate the survey participant experience once again. Has it changed? Has it improved? Are surveys shorter, more enjoyable and more rewarding than they were in the mid-2000s? I wasn’t optimistic.

To help answer my questions, I joined 16 panels (see Table A), and over the course of many months, I participated in hundreds of online surveys on my desktop PC, laptop, iPhone and iPad. All surveys were taken on the Microsoft Edge browser, and I did not use any survey apps.

In this article I’ll look at each experience, in order, that a panelist encounters when participating in an online survey. Although the sample size of surveys taken is large, I did not record and quantify my experience. My conclusion is based on a qualitative assessment of my collective experiences.

Survey recruitment

Much like years ago, I received e-mail invitations to take surveys. And much like years ago, the number of e-mail invitations varied widely by panel, ranging from as few as one per week to as many as 80 per week (see Table A). From my participant perspective, fewer invitations made me feel “fortunate” to have been invited to the survey, whereas the larger number of invitations seemed excessive to me. Others may appreciate the frequent notifications of survey availability.

However, unlike my experience from years ago, the invitation rarely took me to the survey advertised in the e-mail, instead placing me in a “router” that would either find a survey for me in about 15-to-30 seconds or take me to a set of general screening questions before putting me in the router.

I should note that two of the 16 panels operated more like those from years past. For these two, I would receive an invitation for an advertised survey and then be taken directly to that survey.

However, unlike in the past, I would be presented with another survey opportunity if I was disqualified.

The other unique aspect of the recruitment process for me was that most panels have a survey dashboard that I could visit at any time or be returned to after completing or disqualifying from a survey. On the dashboard, I could choose from several surveys ranging in length and reward. If I wanted a shorter survey, I could typically find one with a smaller reward. And if I was ambitious or motivated by larger rewards, I could typically find one ranging from 20 to even 45 or 60 minutes. Dashboards can also stimulate participation in specific surveys by increasing the number of rewards points per minute, which are easily comparable across surveys.

I recall many complaints in the past from panelists who did not like being disqualified from a survey and then having no other opportunities until a new invitation arrived (which often led to the same experience). Although there are pros (maximizes panelist utility) and cons (impact on sample representativeness) to dashboards for researchers, today’s recruitment approaches have eliminated this participant pain point.

Survey screening

I recall attending conferences where panel companies and others would discuss the need to utilize panelist demographic information so that participants did not have to answer the same demographic questions repeatedly while attempting to qualify for a survey.

Unfortunately, this need has not been met.

Hence, the process of attempting to qualify for a survey requires “routing” from one survey to the next, answering the same demographic questions repeatedly until finally qualifying or running out of available surveys. It can be incredibly frustrating, even laughable at times.

There were two other interesting aspects of the screening process:

  • This is not social or political commentary, but it surprised me how many ways there now are to ask for a participant’s gender – one question had 13 options. Perhaps those who qualify for these responses are appreciative, but I think the many variations draw unwanted attention to a complex issue. Perhaps a standard will emerge.
  • Bot detectors are now commonplace, appearing in most surveys. They’re not necessarily problematic, but those who aren’t familiar with them or their purpose might question why the survey is asking which one of the photos is an apple. At times, I had to prove I was a human two or three times.

It should be noted here that I answered every screening and survey question honestly, as a consumer, except for one. I, like most panelists, know how to dodge the security screener, and I did not give myself up each time.

Survey length

Although I don’t have data points, I believe that it’s generally agreed that many surveys are too long and that survey participants much prefer shorter surveys to lengthy surveys. In my experience, practitioners (including me) generally agree that surveys really shouldn’t exceed 15, maybe 20, minutes – largely to maintain broad survey participation and to ensure data quality. At conferences, practitioners usually debate which stakeholder – the corporate buyers, the market research suppliers, the panel companies – should own and enforce the effort to shorten surveys. A partnership among leading companies from each stakeholder is typically sought, but it is seemingly an impossible task.

Based on my experiment, I don’t believe anything has changed. While there are 5-, 10- and 15-minute surveys, there are also many 20-, 30-, 45- and even 60-minute surveys. Because of this experiment, I did not shy away from the lengthier surveys. I tried to complete them – I really did – but I often couldn’t. They seemed to go on and on with no end in sight. Furthermore, once my eyes began to glaze over and the frustration set in (around the 20-minute mark), I could no longer vouch for the quality of my responses, so it is probably better that I often gave up. If I had been motivated by the incentive/reward, perhaps I would have marched on.

It wasn’t just the survey length. It was the repetitiveness, the length of the attribute batteries, the endless detailed questions about brands I know little about or everyday experiences I can barely recall. I’m a researcher, and I know why we do this, but please take a 30-minute or longer survey and see if you disagree.

Survey enjoyment

Based on my previous account, this assessment will not surprise you. When comparing my experience in the mid-2000s to my recent experience, I noticed only minor improvements on this dimension, and only in a minority of the surveys I took. The improvements generally came in the form of easier administration of attributes and other types of batteries that require repetition.

Sometimes the attributes would appear one by one, and on occasion, I could drag-and-drop responses into rank order or onto a fixed scale.

I didn’t expect to be entertained, but I expected the experience to be an ounce or two more enjoyable (or less boring) than previously based simply on advancements in technology. I’ve seen wonderful examples of unique and/or gamified question types that are seemingly more “enjoyable” and engaging, but I didn’t encounter any of these, not once. I have little doubt that it would encourage more/repeat participation, increase the likelihood of completion and produce higher quality data – particularly for lengthier surveys. But the industry appears uninterested in making the necessary investments to achieve these benefits.

Survey quality

Survey quality was the biggest surprise for me. In my opinion, survey quality has declined. The issues that led me to this conclusion were neither egregious nor rampant. Many were of the basic variety, including misspellings, missing words and poor punctuation. More concerning issues included poorly worded questions, leading questions, unanswerable questions, confusing or incorrect scales, missing or failed programming logic and more. The most common (and frustrating) issue was the failure to include a “none of these” or “don’t know” option – when this occurred, I was forced to randomly select a response, and I shuddered at the thought of someone presenting this data.

Perhaps it shouldn’t have been a surprise. Software as a service (SaaS) survey platforms have increased in popularity because of the cost and speed gains that come from allowing essentially anyone in an organization to write and deploy a survey. I am not going to list the pros and cons of SaaS survey platforms, but it seems apparent to me that some, even many, surveys are not drafted by an experienced and trained practitioner.

I notice the issues, but I suspect that most panelists either don’t notice them or don’t care. And maybe that’s the point. I’m beginning to believe that “good enough” is now the standard and that organizations believe the benefits of SaaS platforms outweigh some amount of bias that no one will ever know or care about.

There was one pleasant surprise, however. Surveys taken on my iPhone browser were better than I expected. Some were optimized for the smartphone and others required a bit of pinching and shifting, whereas only a small number were no different than on a computer browser and unreadable for my worsening eyesight.

Survey incentives and rewards

As previously stated, it is my belief that surveys are unrewarding. As shown in Table A, on three of the 16 panels, it could take as few as four or five surveys to earn a $5 reward. Not bad. Keep in mind, however, that those four or five surveys need to be of the 20-minute+ variety, and the participant must qualify for and complete the surveys – not always easy tasks. But at least the reward feels attainable and tangible. Participants no longer have to use their earned currency for magazine subscriptions – today’s rewards are cash or gift cards (Amazon is almost always one of the choices).

As you will also see in Table A, most panels require significantly more completed surveys to earn the baseline reward. In my estimation, it is quite challenging to earn a reward, which signifies “unrewarding” or at least mildly rewarding to me.

The survey participant experience

Based on this experiment, I’ve come to three conclusions. The first is that, in my opinion, surveys remain very much the same as in the mid-2000s – too long, boring and unrewarding.

The second conclusion is more revealing. It seems apparent to me that the MR industry does not need to improve the participant experience, or it would have improved it by now. Long, boring and unrewarding surveys are completed every day, and if quotas are consistently met, why does anything need to change? Perhaps I should be pleased that the lifeblood of our industry – survey participation – is theoretically sufficiently healthy.

My third conclusion is that I must be an ideologue who yearns for something better but is not pragmatic. I get it … if it ain’t broke, don’t fix it.


Autonomous Cars and Consumer Choices: A Stated Preference Approach

  • Published: 06 August 2024


  • Hiroaki Miyoshi, ORCID: orcid.org/0000-0003-1466-0475

This study aims to identify the factors that influence consumers' choice of autonomous cars in their new car purchase behavior. We surveyed stated preferences for autonomous driving technology and then analyzed the survey data using a multinomial logit model to examine how various factors—including price, consumer attributes, and respondents' evaluations of autonomous car characteristics—influence consumers' choice of car technology. The results of the analysis indicate that experience with adaptive cruise control (ACC), as well as expectations about the enjoyment that autonomous cars will bring to consumers' daily lives, have a significant impact on their decision.
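The paper's actual model specification and data are not reproduced here; as a generic, minimal sketch of fitting a multinomial logit to choice data of this kind (all variable names, coefficients, and values below are invented for illustration only), one might write:

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 300

# Hypothetical stated-preference data: each respondent chooses
# 0 = conventional, 1 = ADAS-equipped, 2 = autonomous car.
df = pd.DataFrame({
    "price_premium":     rng.uniform(0.0, 2.0, n),   # extra cost, million yen (invented)
    "acc_experience":    rng.integers(0, 2, n),      # has used adaptive cruise control
    "expects_enjoyment": rng.integers(1, 6, n),      # 1-5 rating (invented scale)
})

# Simulate choices so the example has a signal to recover.
utility = (-1.0 * df["price_premium"] + 0.8 * df["acc_experience"]
           + 0.5 * df["expects_enjoyment"] + rng.normal(0, 1, n))
df["choice"] = np.select([utility > 2.0, utility > 0.5], [2, 1], default=0)

# Multinomial logit: how the covariates shift choice probabilities
# relative to the base alternative (conventional car).
X = sm.add_constant(df[["price_premium", "acc_experience", "expects_enjoyment"]])
result = sm.MNLogit(df["choice"], X).fit(disp=False)
print(result.summary())
```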


[13] provides a comprehensive review of the literature on stated preference/choice studies related to automated vehicles.

Contents of the SP surveys in this study are described in detail in [14].

To facilitate understanding, the three car classes (ADAS-equipped cars, Level-3 cars, and Level-4 cars) were described to survey respondents respectively as assisted-driving cars, partially autonomous cars, and highly autonomous cars.

Note that subclass R3 is not used in the survey.

For this portion of the survey, a range of highways permitting autonomous driving was indicated, and survey respondents were shown one of the following three options, selected at random: (1) Major highways (Tomei, etc.), (2) Highways with two lanes in each direction, (3) All highways.

This refers to the frequency with which take-over requests (TORs) are issued. Survey respondents were shown one of the following four options, selected at random: (1) once per hour, (2) once per day, (3) once per week, (4) once per month.

This refers to the length of time within which a human driver must respond to a take-over request. Survey respondents were shown one of the following four options, selected at random: (1) immediately, (2) within 3 seconds, (3) within 10 seconds, (4) within 30 seconds.

This refers to the frequency of minimum-risk maneuver (MRM) incidents. Survey respondents were shown one of the following four options, selected at random: (1) once per day, (2) once per month, (3) once per year, (4) once every two years.

See [17] for further information on all items except (B-4).

This survey queried household income in a multiple-choice format, with a single option for all income levels above 20 million yen. To exclude artifacts of this question design from the explanatory variable for household income, we introduce a dummy variable indicating income above 20 million yen.
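As a small illustration of that choice (the variable names, values, and coding below are invented, not taken from the study's data), the top-code dummy might be constructed like this:

```python
import pandas as pd

# Hypothetical household-income responses in millions of yen, where every
# bracket above 20 million yen was collapsed into a single top category.
df = pd.DataFrame({"income_million_yen": [4.5, 8.0, 12.0, 20.0, 20.0, 6.5]})

# Dummy flagging top-coded responses, so the income coefficient is not
# distorted by the open-ended top bracket.
df["income_above_20m"] = (df["income_million_yen"] >= 20.0).astype(int)
print(df)
```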

For example, gasoline cars with an engine displacement above 2000 cc are included in standard-sized cars.

Abbreviations

ADAS: Advanced driver-assistance systems

AEB: Autonomous emergency braking

DA: Driving assistant

TOR: Take-over request

MRM: Minimum-risk maneuver

ACC: Adaptive cruise control

Nordhoff, S., Kyriakidis, M., Van Arem, B., Happee, R.: A multi-level model on automated vehicle acceptance (MAVA): a review-based study. Theor. Issues Ergon. Sci. 20 (6), 682–710 (2019). https://doi.org/10.1080/1463922x.2019.1621406


Venkatesh, V., Morris, M.G., Davis, G.B., Davis, F.D.: User acceptance of information technology: toward a unified view. MIS Q. 27 (3), 425–478 (2003). https://doi.org/10.2307/30036540

Venkatesh, V., Thong, J.Y.L., Xu, X.: Unified theory of acceptance and use of technology: a synthesis and the road ahead. J. Assoc. Inf. Syst. 17 (5), 328–376 (2016). https://doi.org/10.17705/1jais.00428

Bansal, P., Kockelman, K.M., Singh, A.: Assessing public opinions of and interest in new vehicle technologies: an Austin perspective. Transp. Res. Part C: Emerg. Technol. 67 , 1–14 (2016). https://doi.org/10.1016/j.trc.2016.01.019

Jiang, Y., Zhang, J., Wang, Y., Wang, W.: Capturing ownership behavior of autonomous vehicles in Japan based on a stated preference survey and a mixed logit model with repeated choices. Int. J. Sustain. Transp. 13 (10), 788–801 (2018). https://doi.org/10.1080/15568318.2018.1517841

Liu, P., Guo, Q., Ren, F., Wang, L., Xu, Z.: Willingness to pay for self-driving vehicles: Influences of demographic and psychological factors. Transp. Res. C: Emerg. Technol. 100 , 306–317 (2019). https://doi.org/10.1016/j.trc.2019.01.022

Liu, P., Yang, R., Xu, Z.: Public acceptance of fully automated driving: effects of social trust and risk/benefit perceptions. Risk Anal. 39 (2), 326–341 (2019). https://doi.org/10.1111/risa.13143

Shabanpour, R., Golshani, N., Shamshiripour, A., Mohammadian, A.: Eliciting preferences for adoption of fully automated vehicles using best-worst analysis. Transp. Res. Part C: Emerg. Technol. 93 , 463–478 (2018). https://doi.org/10.1016/j.trc.2018.06.014

Tan, L., Ma, C., Xu, X., Xu, J.: Choice behavior of autonomous vehicles based on logistic models. Sustainability 12 (1) (2019). https://doi.org/10.3390/su12010054

Daziano, R.A., Sarrias, M., Leard, B.: Are consumers willing to pay to let cars drive for them? Analyzing response to autonomous vehicles. Transp. Res. Part C: Emerg. Technol. 78 , 150–164 (2017). https://doi.org/10.1016/j.trc.2017.03.003

Haboucha, C.J., Ishaq, R., Shiftan, Y.: User preferences regarding autonomous vehicles. Transp. Res. Part C: Emerg. Technol. 78 , 37–49 (2017). https://doi.org/10.1016/j.trc.2017.01.010

Potoglou, D., Whittle, C., Tsouros, I., Whitmarsh, L.: Consumer intentions for alternative fuelled and autonomous vehicles: a segmentation analysis across six countries. Transp. Res. Part D: Transp. Environ. 79 , 1–17 (2020). https://doi.org/10.1016/j.trd.2020.102243

Gkartzonikas, C., Gkritza, K.: What have we learned? A review of stated preference and choice studies on autonomous vehicles. Transp. Res. Part C: Emerg. Technol. 98 , 323–337 (2019). https://doi.org/10.1016/j.trc.2018.12.003

University of Tokyo, and Doshisha University (commissioned by NEDO), Research on assessment of the impact of automated driving on society and the economy and on measures to promote deployment (In Japanese), 2023 available at https://www.sip-adus.go.jp/rd/ . Accessed 30 Nov 2023

Suda, Y., Miyoshi, H.: Development of assessment methodology for socioeconomic impacts of automated driving including traffic accident reduction. In: SIP 2nd Phase: Automated Driving for Universal Services-Final Results Report (2018-2022), pp. 173–179. (2022). Available at https://www.sip-adus.go.jp/rd/rd_page04.php . Accessed 30 Nov 2023


Nishihori, Y., Morikawa, T.: Analysis of the important factors affecting the acceptance of autonomous vehicles before and after test ride: Considering the awareness difference for technology and contents of test ride. J City Plann Inst Jpn 54 (3), 696–702 (2019). Available at https://cir.nii.ac.jp/crid/1390845702311790720 . Accessed 30 Nov 2023 (In Japanese)

Miyaki, Y.: Surveys and evaluations for fostering public acceptance. In: SIP 2nd Phase: Automated Driving for Universal Services—Mid-Term Results Report (2018–2020), pp. 124–129 (2021). Available at https://www.sip-adus.go.jp/rd/rd_page03.php . Accessed 30 Nov 2023

Kyriakidis, M., Happee, R., de Winter, J.C.F.: Public opinion on automated driving: results of an international questionnaire among 5000 respondents. Transp. Res. Part F: Traff. Psychol. Behav. 32 , 127–140 (2015). https://doi.org/10.1016/j.trf.2015.04.014

Dong, X., DiScenna, M., Guerra, E.: Transit user perceptions of driverless buses. Transportation 46 (1), 35–50 (2017). https://doi.org/10.1007/s11116-017-9786-y

Krueger, R., Rashidi, T.H., Rose, J.M.: Preferences for shared autonomous vehicles. Transp. Res. Part C: Emerg. Technol. 69 , 343–355 (2016). https://doi.org/10.1016/j.trc.2016.06.015

Menon, N.: Autonomous vehicles: An empirical assessment of consumers' perceptions, intended adoption, and impacts on household vehicle ownership, University of South Florida, available at https://digitalcommons.usf.edu . Accessed 30 Oct 2023

Nordhoff, S., de Winter, J., Kyriakidis, M., van Arem, B., Happee, R.: Acceptance of driverless vehicles: results from a large cross-national questionnaire study. J. Adv. Transp. 1–22 (2018). https://doi.org/10.1155/2018/5382192

Revelle, W.R.: psych: Procedures for personality and psychological research (Version 2.3.9) [Computer software]. 2023, available at https://cran.r-project.org/web/packages/psych/psych.pdf (accessed September 30, 2023)

Croissant, Y.: Estimation of multinomial logit models in R: The mlogit Packages. J. Stat. Softw. 95 (11), 1–41 (2020). https://doi.org/10.18637/jss.v095.i11

Taniguchi, A., Enoch, M., Theofilatos, A., Ieromonachou, P.: Understanding acceptance of autonomous vehicles in Japan, UK, and Germany. Urban Plann. Transp. Res. 10 (1), 514–535 (2022). https://doi.org/10.1080/21650020.2022.2135590

Miyaki, Y.: Public attitudes towards automated driving. Trends in the Sciences 27 (2), 100–104 (2022). https://doi.org/10.5363/tits.27.2_100 . (In Japanese)


Acknowledgements

This paper is based on results obtained from a project, JPNP18012, commissioned by the New Energy and Industrial Technology Development Organization (NEDO). For the survey discussed in this paper, the questions regarding consumer acceptance that appear on the survey page titled “Survey questions probing awareness and opinions of autonomous driving technology” are questions that were previously used to gauge social acceptance within another project in JPNP18012 . We extend our heartfelt gratitude to NEDO and Yukiko Miyaki, executive chief researcher at Dai-Ichi Research Institute Inc., for granting permission to use this material. The survey was conducted with the participation of Shoji Watanabe, a researcher at Doshisha University. We sincerely appreciate his efforts in conducting the survey. We are also grateful for the many valuable suggestions we received from researchers at the University of Tokyo.

The online surveys conducted for this study were reviewed and approved (case number 2021-09) by the ethics committee of Doshisha University’s Institute for Technology, Enterprise and Competitiveness.

Author information

Authors and Affiliations

Faculty of Policy Studies, Doshisha University, Karasuma-higashi-iru, Imadegawa-dori, Kamigyo-ku, Kyoto, 602-8580, Japan

Hiroaki Miyoshi


Corresponding author

Correspondence to Hiroaki Miyoshi .

Ethics declarations

Conflict of interest.

The authors declare that they have no conflict of interest.

Additional information

Publisher's note.

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.

Reprints and permissions

About this article

Miyoshi, H. Autonomous Cars and Consumer Choices: A Stated Preference Approach. Int. J. ITS Res. (2024). https://doi.org/10.1007/s13177-024-00408-1

Download citation

Received : 09 December 2023

Revised : 23 April 2024

Accepted : 10 June 2024

Published : 06 August 2024

DOI : https://doi.org/10.1007/s13177-024-00408-1


  • Autonomous car
  • Social acceptance
  • Consumer expectation
  • Stated preference
  • Factor analysis
  • Multinomial logit model
  • Adaptive cruise control


Invest in team sports, say 41% of experts, with women's sport on track for continued growth - new PwC Global Sports Survey

  • Press Release
  • 02 Aug 2024

Teams and leagues (41%), gaming (22%) and tech (17%) are the most attractive sectors to invest in

85% of sports experts predict double-digit growth in women's sport

59% of sports organisations do not have a strategy for GenAI incorporation into their business model

PwC today unveiled the 8th edition of its Global Sports Survey, showcasing optimistic prospects for the business of sport. The survey, which gathered insights from 411 sports leaders worldwide, examined the current state of the sports industry and leaders' expectations for the next three to five years. While the overall outlook is cautiously optimistic, the report indicates positive sentiment across all regions.

With the continued growth of big US sports and global football, as well as the emergence of sports such as Padel and Pickleball, there are significant growth opportunities across the sector. 

Clive Reeves, PwC Global Sports Leader comments: 

"The report provides valuable insights into the state of the sports industry and its future prospects, from some of the top leaders and sports strategists globally. Despite economic uncertainties, the industry remains optimistic and poised for growth. This summer in particular highlights the pulling power that sports continue to have - from the Men’s UEFA Euro 2024 to the Paris Olympic and Paralympic Games.”

“We believe this survey will be an invaluable resource for businesses and industry leaders, sparking crucial discussions and debates on how the sports industry can seize growth opportunities and navigate potential challenges."

Investment in the sports industry

The sports sector is becoming highly sought-after due to continued revenue growth (based on media rights and sponsorship deals) and the emergence of more sustainable business models. Sports experts surveyed are optimistic about institutional investors' prospects, especially given growth in the sports market. While growth isn't universal, certain investors are well-positioned to seize opportunities in this evolving landscape. The 411 experts surveyed by PwC identified teams and leagues (41%), gaming (22%) and tech (17%) as the top investment areas.

The scarcity of investable sports assets has driven up valuations in recent transactions, prompting shifts in investment strategies and structures. Relaxing ownership regulations in the US and the continued rise in asset valuations are attracting ever more diverse investors. Sports leaders surveyed reported growing popularity for minority investment and joint-venture investment (both 36%). Dedicated sports investment funds and athlete-backed funds are becoming more active, partnering with operators to unlock value and seek synergies across portfolios, with athlete funds in particular mobilising their pulling power to attract bigger audiences.

Major sporting events

44% of sports executives believe financial concerns are a key barrier to hosting a large event. While citing the investment and infrastructure benefits (25%) and the tourism benefits (24%) of hosting such events, experts also point to a lack of public support (19%) and a lack of suitable facilities (15%). The majority of those surveyed believe that new hosting models (64%) and the utilisation of existing venues (60%) are needed to help deliver future major sporting events in a sustainable way.

86% of experts also believe a multi-location model is best for hosting such events - good news for the upcoming EURO 2028 across the UK and Ireland, as well as future editions of the Olympics and Paralympics.

Women’s sport

Women's sports are experiencing a surge in interest and significant growth potential, driven by record-breaking events such as the FIFA Women's World Cup (summer 2023) and the NCAA women's basketball tournament (March 2024). Sports executives believe women's sports will see double-digit growth over the next three to five years, highlighting the significant potential still to be realised.

Sports leaders surveyed highlight that increased media coverage has been crucial for this growth and stress the importance of continued focus from broadcasters and media outlets. To attract new audiences, 18% of respondents cite increased promotion (advertising, ticket prices), 16% emphasise live broadcasting of women’s sports events, and 13% mention enhancing the matchday experience (food options, accessibility). Additionally, 12% advocate for improved athlete storytelling, and another 12% emphasise the need for family-friendly scheduling.

GenAI & Sports

Generative AI (GenAI) and other innovative technologies present significant growth opportunities for sporting organisations, but their adoption and impact vary across the industry. Sports tech and media companies are poised to benefit the most, focusing on content creation, distribution and fan engagement. Yet 59% of sports leaders reported not having a clear GenAI strategy, highlighting barriers such as funding and capability requirements, thus offering early adopters a chance to gain a competitive edge.

Media (26%), technology (21%) and fantasy/betting (16%) are the industry stakeholders that stand to benefit most from GenAI, thanks to an enhanced ability to create content more quickly and at lower cost, unlocking further commercial opportunities and driving new pricing and fan engagement models. With many GenAI use cases still emerging, sports organisations will likely adopt a 'wait and see' approach. As more solutions come to market, organisations will have a greater opportunity to develop an overarching strategy and implementation plan.

Survey results suggest that teams, leagues and federations have been slower to embrace GenAI - 67% of respondents in this category do not yet have a plan for GenAI, while 15% do not see it as relevant for the business.

At PwC, our purpose is to build trust in society and solve important problems. We’re a network of firms in 151 countries with over 364,000 people who are committed to delivering quality in assurance, advisory and tax services. Find out more and tell us what matters to you by visiting us at  www.pwc.com . PwC refers to the PwC network and/or one or more of its member firms, each of which is a separate legal entity. Please see  www.pwc.com/structure  for further details.

© 2024 PwC. All rights reserved.

Media Enquiries

Press office, PwC United Kingdom

Gemma-Louise Bond

Corporate Affairs Manager - Retail, consumer and leisure, PwC United Kingdom

Tel: +44 (0)7483 147794


2024:Q2 Quarterly Highlights

Labor market conditions improved slightly for recent college graduates in the second quarter of 2024. The unemployment rate edged down to 4.5 percent and the underemployment rate inched lower to 40.5 percent.

This web feature tracks employment data for recent college graduates across the United States since 1990, allowing for a historical perspective on the experience of those moving into the labor market. It allows users to:

  • compare the unemployment rate for recent college graduates with that of other groups
  • monitor the underemployment rate of recent college graduates

A table tracks outcomes by college major with the latest available annual data.

How to cite this report:

Federal Reserve Bank of New York, The Labor Market for Recent College Graduates, https://nyfed.org/collegelabor.


The data do not represent official estimates of the Federal Reserve Bank of New York, its President, the Federal Reserve System, or the Federal Open Market Committee.

We describe the framework for this analysis in “Underemployment in the Early Careers of College Graduates following the Great Recession” (NBER Studies in Income and Wealth ) and “Are Recent College Graduates Finding Good Jobs?,” a 2014 article in the New York Fed’s Current Issues in Economics and Finance series. These papers examine more than two decades of data on the employment outcomes of recent college graduates across the United States, and contain more details and historical perspective.

We launched this web feature to make some of the data featured in these papers available on a timely and updated basis. New unemployment and underemployment data for recent college graduates are posted quarterly (typically in February, May, August, and November), and wage and outcome data for college graduates are released annually (typically in February). Data extend from 1990 to the present. Periodic analyses of these data are published on the Liberty Street Economics blog.

Our definition of underemployment is based on the kinds of jobs held by college graduates. A college graduate working in a job that typically does not require a college degree is considered underemployed. We use survey data from the U.S. Department of Labor’s Occupational Information Network (O*NET) Education and Training Questionnaire to help determine whether a bachelor’s degree is required to perform a job. The articles cited above describe our approach in detail.
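As a toy illustration of this definition, an underemployment rate can be computed by flagging graduates whose jobs do not typically require a bachelor's degree. The occupation mapping and worker records below are invented, and the New York Fed's actual classification relies on O*NET survey responses rather than a hand-written table.

```python
# Illustrative mapping of occupations to whether a bachelor's degree is
# typically required (values invented for the example).
degree_required = {
    "software developer": True,
    "registered nurse": True,
    "retail salesperson": False,
    "barista": False,
}

# Hypothetical jobs held by four recent college graduates.
graduate_jobs = ["software developer", "barista", "retail salesperson", "registered nurse"]

underemployed = [job for job in graduate_jobs if not degree_required[job]]
rate = 100 * len(underemployed) / len(graduate_jobs)
print(f"Underemployment rate: {rate:.1f}%")  # 50.0% in this toy example
```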

Additional research that utilizes these data includes "Working as a Barista After College Is Not as Common as You Might Think" ( Liberty Street Economics ).

Our underemployment figures are calculated as a percentage holding a bachelor’s degree or higher, so they do include those with graduate and professional degrees. See the notes below the x-axis on the Underemployment chart for more detail.

The “Share with Graduate Degree” column in the table represents, for each college major, the percentage of workers with a bachelor’s degree that also possesses a graduate degree of any kind. For example, 50.2 percent of those with a bachelor’s degree in history also possess some kind of graduate degree, based on February 2022 data.

All data presented here are national measures.


No; we only publish data for the most recent year available from the American Community Survey, which serves as the source for our analysis.

We do not have updated data by gender available in this web feature, but we did provide some gender analysis in “Underemployment in the Early Careers of College Graduates following the Great Recession.”

Unfortunately, at this time, our analysis only pertains to those with at least a bachelor’s degree.

No. Through 2023:Q2, we examined the types of jobs held by those who are underemployed, categorizing jobs broadly by skill level and pay to generate time series data for the percentages of graduates holding “good non-college jobs” and “low-wage jobs.” Starting with the 2023:Q3 update, the web feature will no longer include the data series for "underemployed job types," although historical data remain available for download.

In our definition, early career graduates are those aged 22 to 27, and mid-career graduates are those aged 35 to 45.


Phasing out the ‘D-word’

By Mike Zuendel Aug. 5, 2024


As I roamed the meeting rooms and halls of the Alzheimer's Association International Conference in Philadelphia last week, I kept hearing a word — dementia — I've come to loathe as someone with early Alzheimer's.

The use of this term goes back as far as the late 1500s, when it referred to insanity. It’s an inaccurate, outdated, and stigmatizing term that I and others living with cognitive impairment want to see retired permanently. Not only is it offensive, but it actively holds back early diagnosis, effective care, and faster research progress towards new, life-changing treatments.


I’ve felt the stigma firsthand. On more than one occasion, health care professionals or researchers referred to me as “demented” or “having dementia.” Those utterances made me feel hurt, judged, and counted out. And it certainly didn’t make me want to return to that doctor’s office or take part in a long, demanding clinical trial.

Sentiments like that make a difficult problem even more intractable. According to a recent survey , 8 in 10 people over age 65 associated stigma with the word “dementia.” If it’s already difficult to get people to seek treatment earlier, why use language that makes it even harder?


No one should feel unwelcome or criticized in their doctor’s office. People should be able to seek help for memory problems or cognitive impairment, knowing they’ll be treated with dignity and respect. It’s in the Alzheimer’s community’s power to make sure that happens.

That’s why I’m eager to announce the Initiative to Change the “D-Word.” This nonprofit organization is bringing together a coalition of advocates, professional societies, companies, and policymakers to work to eliminate the words dementia and demented from the lexicon of Alzheimer’s disease and cognitive impairment, reducing stigma and increasing accuracy around these conditions.

The first goal of the initiative is to raise overall public awareness about the stigma around the “D-word” and the need for using alternative language. Just ask those at greatest risk. In the survey I mentioned earlier, almost half of older adults pictured someone with dementia as having difficulty performing everyday tasks, and the top three emotions associated with dementia were confusion, helplessness, and isolation.

It's essential to educate providers and researchers about this effort. People often hesitate to seek help for changes in thinking or memory because they fear how a doctor or other clinician might treat them. That is part of the reason why people delay seeing a doctor for years, leading to a delayed diagnosis. At every step, the stigma of "dementia" throws up barriers.

There are many causes of cognitive impairment, from Alzheimer’s disease to depression and diminished blood flow to and in the brain. “Dementia” covers up these key differences with a single catch-all term. That’s simply not helpful. Clinicians, researchers, and others need to use more precise language that equips people to move forward, rather than holding them back with a term from yesteryear.

When a diagnosis is delayed or inaccurate, an individual has fewer chances to benefit from treatment, take part in a clinical trial, or fully benefit from innovations in treatment options. Many of the newest drugs now available or in trials work best at the earliest stages of the disease. The challenge of finding volunteers in these early stages is a major reason that progress has been so slow, for so long.


I was fortunate to have received a relatively early Alzheimer’s diagnosis. Far from an ending, it opened a new chapter of my life. I got to know other people living with the disease. I got involved with the advocacy community. I learned that life doesn’t stop with a diagnosis; in fact, it’s just the opposite.

Millions of Americans deserve these same benefits. To make that happen, the journey toward diagnosis needs to be as easy as possible. And that depends on a shared effort to break down stigma so people can get the support and care they need, find trial opportunities, and approach their next steps feeling empowered and well-informed.

Changing how we talk can change how we fight Alzheimer’s disease and cognitive impairment — unleashing earlier diagnosis, better care, and improved lives for people with these conditions.

Mike Zuendel founded the "Initiative to Change the 'D-Word.'" He has served on the Alzheimer's Association's Early-Stage Advisors Group and the Global Neurosciences Institute patient council, is a champion for the Voices of Alzheimer's organization, and sits on the Alzheimer's patient advisory board of the Center for Study on Clinical Research Participation and the board of directors of the Banner Alzheimer's Foundation.


J Korean Med Sci. 2020 Nov 23; 35(45)

Reporting Survey Based Studies – a Primer for Authors

Prithvi Sanjeevkumar Gaur

1 Smt. Kashibai Navale Medical College and General Hospital, Pune, India.

Olena Zimba

2 Department of Internal Medicine No. 2, Danylo Halytsky Lviv National Medical University, Lviv, Ukraine.

Vikas Agarwal

3 Department Clinical Immunology and Rheumatology, Sanjay Gandhi Postgraduate Institute of Medical Sciences, Lucknow, India.

Latika Gupta


The coronavirus disease 2019 (COVID-19) pandemic has led to a massive rise in survey-based research. The paucity of perspicuous guidelines for conducting surveys may pose a challenge to the conduct of ethical, valid and meticulous research. The aim of this paper is to guide authors aiming to publish in scholarly journals regarding the methods and means to carry out surveys for valid outcomes. The paper outlines the various aspects of survey research, from planning, execution and dissemination to data analysis and the choice of target journals. While providing a comprehensive understanding of the scenarios most conducive to carrying out a survey, and of the role of ethical approval, survey validation and pilot testing, this brief delves deeper into survey designs, methods of dissemination, ways to secure and maintain data anonymity, analytical approaches, reporting techniques and the process of choosing an appropriate journal. Further, the authors analyze retracted survey-based studies and the reasons for retraction. This review article intends to guide authors to improve the quality of survey-based research by describing the essential tools and means to do so, with the hope of improving the utility of such studies.

Graphical Abstract


INTRODUCTION

Surveys are the principal method used to address topics that require individual self-report about beliefs, knowledge, attitudes, opinions or satisfaction, which cannot be assessed using other approaches. 1 This research method allows information to be collected by asking a set of questions on a specific topic to a subset of people and generalizing the results to a larger population. Assessing opinions in a valid and reliable way requires clear, structured and precise reporting of results. This is possible with a survey built on a meticulous design, followed by validation and pilot testing. 2 The aim of this opinion piece is to provide practical advice for conducting survey-based research. It details the ethical and methodological aspects to be considered while performing a survey, the online platforms available for distributing surveys, and the implications of survey-based research.

Survey-based research is a means to obtain quick data, and such studies are relatively easy to conduct and analyse, and are cost-effective (under a majority of the circumstances). 3 These are also one of the most convenient methods of obtaining data about rare diseases. 4 With major technological advancements and improved global interconnectivity, especially during the coronavirus disease 2019 (COVID-19) pandemic, surveys have surpassed other means of research due to their distinctive advantage of a wider reach, including respondents from various parts of the world having diverse cultures and geographically disparate locations. Moreover, survey-based research allows flexibility to the investigator and respondent alike. 5 While the investigator(s) may tailor the survey dates and duration as per their availability, the respondents are allowed the convenience of responding to the survey at ease, in the comfort of their homes, and at a time when they can answer the questions with greater focus and to the best of their abilities. 6 Respondent biases inherent to environmental stressors can be significantly reduced by this approach. 5 It also allows responses across time-zones, which may be a major impediment to other forms of research or data-collection. This allows distant placement of the investigator from the respondents.

Various digital tools are now available for designing surveys ( Table 1 ). 7 Most of these are free, with separate premium paid options. The analysis of data can be made simpler, and the cleaning process almost obsolete, by minimising open-ended answer choices. 8 Close-ended answers make data collection and analysis efficient by generating a spreadsheet that can be directly accessed and analysed (a minimal sketch of such an analysis appears after Table 1). 9 Minimizing the number of questions and making all questions mandatory can further aid this process by bringing uniformity to the responses and making analysis simpler. Surveys are arguably also the most engaging form of research, conditional on the skill of the investigator.

Table 1. Survey tools: free and paid features

1. SoGoSurvey — Free: pre-defined templates, multilingual surveys, skip logic, question and answer bank, progress bar, add comments, import answers, embed multimedia, print surveys. Paid: advanced reporting and analysis, pre-fill known data into visible and hidden fields, automatic scoring, display custom messages based on quiz scores.
2. Typeform — Free: 3 typeforms, 10 Q/t, 100 A/m, templates, reports and metrics, embed typeform in a webpage, download data. Paid: 10,000 A/m, unlimited logic jumps, remove Typeform branding, payment fields, scoring and pricing calculator, send follow-up emails.
3. Zoho Survey — Free: unlimited surveys, 10 Q/s, 100 A/s, in-mail surveys, templates, embed in website, scoring, HTTPS encryption, social media promotion, password protection, 1 response collector, survey builder in 26 languages. Paid: unlimited questions, respondents and response collectors, question randomization, Zoho CRM, Eventbrite, Slack, Google Sheets, Shopify and Zendesk integration, sentiment analysis, piping logic, white-label survey, upload favicon, Tableau integration.
4. Yesinsights — Free: NA. Paid: 25,000 A/m, NPS surveys, website widget, unlimited surveys and responses.
5. Survey Planet — Free: unlimited surveys, questions and responses, two survey player types, share surveys on social media and email, SSL security, no data-mining or information selling, embed data, pre-written surveys, basic themes, surveys in 20 languages, basic in-app reports. Paid: export results, custom themes, question branching and images with custom formatting, alternative success URL redirect, white-label and kiosk surveys, e-mail survey completion notifications, four chart types for results.
6. Survey Gizmo — Free: 3 surveys, unlimited Q/s, 100 A, raw data exports, share reports via URL, various question and answer options, progress bar and share on social media options. Paid: advanced reports (profile, longitudinal), logic and piping, A/B split testing, disqualifications, file uploads, API access, webpage redirects, conjoint analysis, crosstab reports, TURF reports, open-text analysis, data-cleaning tool.
7. SurveyMonkey — Free: 10 questions, 100 respondents, 15 question types, light theme customization and templates. Paid: unlimited, multilingual questions and surveys, fine control systems, analyse, filter and export results, shared asset library, customised logos, colours and URLs.
8. SurveyLegend — Free: 3 surveys, 6 pictures, unlimited responses, real-time analytics, no data export, 1 conditional logic, ads and watermark, top-notch security and encryption, collect on any device. Paid: unlimited surveys, responses and pictures, unlimited conditional logic, white label, share real-time results, enable data export, 100K API calls and 10 GB storage.
9. Google Forms — Free: unlimited surveys and respondents, data collection in Google spreadsheets, themes, custom logo, add images or videos, skip logic and page branching, embed survey into emails or website, add collaborators. Paid: NA.
10. Client Heartbeat — Free: NA. Paid: unlimited surveys, 50+ users, 10,000+ contacts, 10 sub-accounts, CRM syncing/API access, company branding, concierge support.

Q/t = questions per typeform, A/m = answers per month, Q/s = questions per survey, A/s = answers per survey, NA = not applicable, NPS = net promoter score.
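As noted above, close-ended responses export directly to a tabular file that can be analysed without cleaning. A minimal pandas sketch of such an analysis is shown below; the column names and answer options are placeholders rather than items from any real survey.

```python
import pandas as pd

# Placeholder export from a survey tool; real exports are typically CSV/XLSX
# files read with pd.read_csv() or pd.read_excel().
responses = pd.DataFrame({
    "role": ["physician", "physician", "nurse", "nurse", "physician"],
    "uses_telemedicine": ["Yes", "No", "Yes", "Yes", "Yes"],
})

# Close-ended items tabulate in a single call; no free-text cleaning needed.
print(responses["uses_telemedicine"].value_counts(normalize=True))
print(pd.crosstab(responses["role"], responses["uses_telemedicine"]))
```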

Data protection laws now mandate anonymity while collecting data for most surveys, particularly when they are exempt from ethical review. 10 , 11 Anonymization has the potential to reduce (or at times even eliminate) social desirability bias, which gains particular relevance when targeting responses from socially isolated or vulnerable communities (e.g., LGBTQ and low socio-economic strata communities), minority groups (religious, ethnic and medical) or controversial topics (drug abuse, use of language editing software).
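One generic way to keep stored responses unlinkable to individuals is to replace direct identifiers with salted hashes before analysis. The snippet below is only a sketch of that idea, not a description of how any particular survey platform handles anonymity.

```python
import hashlib
import secrets

# Random salt kept separate from (or discarded with) the response data;
# without it, common identifiers such as email addresses could be guessed.
SALT = secrets.token_hex(16)

def pseudonymize(identifier: str) -> str:
    """Return a salted SHA-256 digest to store in place of the identifier."""
    return hashlib.sha256((SALT + identifier).encode("utf-8")).hexdigest()

print(pseudonymize("respondent@example.com"))  # stored instead of the email
```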

Moreover, surveys could be the primary methodology to explore a hypothesis until it evolves into a more sophisticated and partly validated idea after which it can be probed further in a systematic and structured manner using other research methods.

The aim of this paper is to reduce the incorrect reporting of surveys. The paper also intends to inform researchers of the various aspects of survey-based studies and the multiple points that need to be taken under consideration while conducting survey-based research.

SURVEYS IN THE COVID-19 PANDEMIC

The COVID-19 pandemic has led to a distinctive rise in survey-based research. 12 The need to socially distance amid widespread lockdowns reduced patient visits to hospitals and brought most other forms of research to a standstill in the early pandemic period. A large number of level-3 bio-safety laboratories are engaged in research pertaining to COVID-19, thereby limiting the options for laboratory-based research. 13 , 14 Therefore, surveys appear to be the most viable option for researchers to explore hypotheses related to the situation and its impact in such times. 15

LIMITATIONS WHILE CONDUCTING SURVEY-BASED RESEARCH

Designing a fine survey is an arduous task and requires skill, even though clear guidelines are available. Survey design requires extensive thoughtfulness about the core questions (based on the hypothesis or the primary research question), with consideration of all possible answers and the inclusion of open-ended options to allow recording of other possibilities. A survey should be robust with regard to the questions asked and the answer choices available, and it must be validated and pilot tested. 16 The survey design may be supplemented with answer choices tailored for the convenience of the responder, to reduce effort while making it more engaging. Survey dissemination and engagement of respondents also require experience and skill. 17

Furthermore, the absence of an interviewer prevents clarification of responses to open-ended questions, if any. Internet surveys are also prone to survey fraud through erroneous reporting; hence, the anonymity of surveys is both a boon and a bane. Sample composition is skewed because populations absent from the Internet, such as the elderly or the underprivileged, are not represented. The illiterate population also lacks representation in survey-based research.

The “Enhancing the QUAlity and Transparency Of health Research” network (EQUATOR) provides two separate guidelines replete with checklists to ensure valid reporting of e-survey methodology. These include “The Checklist for Reporting Results of Internet E-Surveys” (CHERRIES) statement and “ The Journal of Medical Internet Research ” (JMIR) checklist.

COMMON TYPES OF SURVEY-BASED RESEARCH

From a clinician's standpoint, common survey types include those centered around problems faced by patients or physicians. 18 Surveys collecting the opinions of various clinicians on a debated clinical topic, and feedback forms typically served after attending medical conferences, prescribing a new drug or trying a new method for a given procedure, are other examples. The formulation of clinical practice guidelines entails Delphi exercises using paper surveys, which are yet another form of survey-mediated research.

The size of a survey depends on its intent; surveys may be large or small. Therefore, identification of the intent behind the survey is essential to allow the investigator to form a hypothesis and then explore it further. Large population-based or provider-based surveys are often conducted and generate mammoth data over the years, e.g., the National Health and Nutrition Examination Survey, the National Health Interview Survey and the National Ambulatory Medical Care Survey.

SCENARIOS FOR CONDUCTING SURVEY-BASED RESEARCH

Despite all that has been said about the convenience of conducting survey-based research, it is prudent to conduct a feasibility check before embarking on one. Certain scenarios may be key in determining the fate of survey-based research ( Table 2 ).

Table 2. Scenarios for conducting survey-based research

Suitable scenarios
  • Respondent-related: (1) avid Internet users are the ideal target demographic; (2) an email database makes reminders convenient; (3) enthusiastic target demographics nullify the need for incentives; (4) a larger sample size is supported; (5) respondents and non-respondents can be matched.
  • Investigator-related: (1) adequate budget for survey dissemination; (2) well-versed with handling all software required for the survey; (3) able to monitor IP addresses and cookies to avoid multiple responses; (4) the survey undergoes pilot testing, validation testing and reliability testing; (5) data entry is allowed without data editing.
  • Survey-related: (1) engaging and interactive using the various tools; (2) fast-evolving content in repeated succession keeps the respondent alert (e.g., Delphi surveys); (3) suitable for recording rare, strange events that later help to develop a hypothesis.

Unsuitable scenarios
  • Respondent-related: (1) populations under-represented on the Internet cannot be included; (2) populations with privacy concerns, such as transgender people, sex workers or rape survivors, need to be promised anonymity; (3) people lacking motivation and enthusiasm require coaxing and convincing by the physician, or incentives as a last resort; (4) an illiterate population is unable to read and comprehend the questions asked.
  • Investigator-related: (1) the investigator is a novice at or inexperienced with web-based tools.
  • Survey-related: (1) accurate and precise data or observational data are needed; (2) an existing study has already validated key observations (e.g., a door-to-door study has already been conducted); (3) qualitative data are being studied.

ETHICS APPROVAL FOR SURVEY-BASED RESEARCH

Approval from the Institutional Review Board should be obtained as required, in keeping with the CHERRIES checklist. However, rules for approval differ by country, and local rules must therefore be checked and followed. For instance, in India, the Indian Council of Medical Research released an article in 2017 updating the concept of broad consent, defined as “consent for an unspecified range of future research subject to a few contents and/or process restrictions.” It also describes “the flexibility of Indian ethics committees to review a multicentric study proposal for research involving low or minimal risk, survey or studies using anonymized samples or data or low or minimal risk public health research.” The reporting of approvals received and applied for, and of the procedure of written, informed consent followed, must be clear and transparent. 10 , 19

The use of incentives in surveys is also an ethical concern. 20 Incentives may be monetary or non-monetary. Monetary incentives are usually discouraged, as they may attract the wrong population tempted by the monetary benefit; however, monetary incentives have been seen to give surveys greater traction, even though this is yet to be proven. Monetary incentives are provided not only as cash or cheques but also in the form of free articles, discount coupons, phone cards, e-money or cashback value. 21 These methods, though tempting, must be used sparingly; if used, their use must be disclosed and justified in the report. Non-monetary incentives, such as a meeting with a famous personality or access to restricted and authorized areas, can also help pique the interest of respondents.

DESIGNING A SURVEY

As mentioned earlier, the design of a survey is reflective of the skill of the investigator curating it. 22 Survey builders can be used to design an efficient survey; these offer a majority of the basic features needed to construct a survey free of charge. Surveys can therefore be designed from scratch, using pre-designed templates, or by using previous survey designs as inspiration. Taking surveys can be made convenient by using the various aids available ( Table 1 ). Moreover, the investigator should be mindful of the unintended response effects of the ordering and context of survey questions. 23

Surveys using clear, unambiguous, simple and well-articulated language record precise answers. 24 A well-designed survey accounts for the culture, language and convenience of the target demographic. The age, region, country and occupation of the target population are also considered before constructing a survey. Consistency is maintained in the terms used in the survey, and abbreviations are avoided so that respondents have a clear understanding of the question being answered. Universal or previously indexed abbreviations maintain the unambiguity of the survey.

Surveys beginning with broad, easy and non-specific questions, rather than sensitive, tedious and specific ones, receive more accurate and complete answers. 25 Questionnaires designed such that the relatively tedious and long questions requiring some nit-picking by the respondent are placed at the end improve the response rate; this prevents the respondent from being discouraged at the outset and motivates the respondent to finish the survey. All questions should provide a non-response option, and all questions should be made mandatory to increase the completeness of the survey. Questions can be framed in a close-ended or open-ended fashion. Close-ended questions are easier to analyze and less tedious for the respondent to answer, and should therefore be the main component of a survey. Open-ended questions have minimal use, as they are tedious, take time to answer and require fine articulation of one's thoughts; their minimal use is also advocated because the interpretation of such answers demands considerable time and energy owing to the diverse nature of the responses, which is difficult to promise given large sample sizes. 26 However, whenever the closed choices do not cover all probabilities, an open answer choice must be added. 27 , 28

Screening questions can be used to ensure that respondents meet certain criteria before gaining access to the survey, in cases where inclusion criteria need to be established to maintain the authenticity of the target demographic. Similarly, logic functions can be used to apply exclusion criteria. This allows a clean and clear record of responses and makes the investigator's job easier. Respondents may or may not be given the option to return to a previous page or question to alter their answer, as per the investigator's preference.

The range of responses received can be reduced in case of questions directed towards the feelings or opinions of people by using slider scales, or a Likert scale. 29 , 30 In questions having multiple answers, check boxes are efficient. When a large number of answers are possible, dropdown menus reduce the arduousness. 31 Matrix scales can be used to answer questions requiring grading or having a similar range of answers for multiple conditions. Maximum respondent participation and complete survey responses can be ensured by reducing the survey time. Quiz mode or weighted modes allow the respondent to shuffle between questions and allows scoring of quizzes and can be used to complement other weighted scoring systems. 32 A flowchart depicting a survey construct is presented as Fig. 1 .

Fig. 1. Flowchart depicting the construction of a survey.
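To make the question-design points above concrete, a questionnaire with mandatory close-ended items, a Likert-type item and a simple screening (skip-logic) rule could be represented as plain data structures, as in the sketch below; the wording, options and rule are invented for illustration only.

```python
# Illustrative questionnaire: every item is mandatory, close-ended, and
# carries an explicit non-response option.
questions = [
    {"id": "q1", "text": "Do you currently own a car?",
     "options": ["Yes", "No", "Prefer not to say"], "mandatory": True},
    {"id": "q2", "text": "How satisfied are you with your daily commute?",
     "options": ["1", "2", "3", "4", "5"],  # 5-point Likert scale
     "mandatory": True},
]

# Simple skip logic: respondents answering "No" to q1 are not shown q2.
def questions_to_show(answers: dict) -> list:
    if answers.get("q1") == "No":
        return [q for q in questions if q["id"] != "q2"]
    return questions

print([q["id"] for q in questions_to_show({"q1": "No"})])   # ['q1']
print([q["id"] for q in questions_to_show({"q1": "Yes"})])  # ['q1', 'q2']
```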

Survey validation

Validation testing, though tedious and meticulous, is a worthy effort, as the accuracy of a survey is determined by its validity. It is indicative of the adequacy of the survey sample and the specificity of the questions, such that the data acquired are streamlined to answer the questions being posed or to test a hypothesis. 33 , 34 Face validation assesses whether questions are constructed in a manner that collects the necessary data. Content validation assesses the relation of the questions to the topic being addressed and its related areas. Internal validation makes sure that the questions being posed are directed towards the outcome of the survey. Finally, test-retest validation assesses the stability of questions over a period of time by administering the questionnaire twice with a time interval between the two tests. For surveys assessing respondents' knowledge of a certain subject, it is advised to have a panel of experts undertake the validation process. 2 , 35

Reliability testing

If the questions in the survey are posed in a manner that elicits the same or similar responses from respondents irrespective of the language or construction of the question, the survey is said to be reliable; reliability is thereby a marker of the consistency of the survey. This is of considerable importance in knowledge-based research, where recall ability is tested by making the survey available to the same participants at regular intervals. Varying the construction of the questions can also be used to maintain the authenticity of the survey.
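Reliability is usually quantified rather than merely asserted. For instance, a test-retest correlation and Cronbach's alpha can be computed from pilot data, as in the small sketch below; the scores are made up, and the formula used is the standard alpha = k/(k-1) x (1 - sum of item variances / variance of the total score).

```python
import numpy as np

# Made-up scores from five pilot respondents answering the same three-item
# knowledge scale twice, a few weeks apart.
test   = np.array([[4, 5, 3], [2, 3, 2], [5, 5, 4], [3, 4, 3], [1, 2, 2]])
retest = np.array([[4, 4, 3], [2, 2, 2], [5, 5, 5], [3, 3, 3], [2, 2, 1]])

# Test-retest reliability: correlation of total scores across the two waves.
r = np.corrcoef(test.sum(axis=1), retest.sum(axis=1))[0, 1]

# Internal consistency (Cronbach's alpha) for the first administration.
k = test.shape[1]
alpha = k / (k - 1) * (1 - test.var(axis=0, ddof=1).sum()
                       / test.sum(axis=1).var(ddof=1))

print(f"test-retest r = {r:.2f}, Cronbach's alpha = {alpha:.2f}")
```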

Designing a cover letter

A cover letter is the primary means of communication with the respondent, with the intent to introduce the respondent to the survey. A cover letter should include the purpose of the survey, details of those who are conducting it, including contact details in case clarifications are desired. It should also clearly depict the action required by the respondent. Data anonymization may be crucial to many respondents and is their right. This should be respected in a clear description of the data handling process while disseminating the survey. A good cover letter is the key to building trust with the respondent population and can be the forerunner to better response rates. Imparting a sense of purpose is vital to ideationally incentivize the respondent population. 36 , 37 Adding the credentials of the team conducting the survey may further aid the process. It is seen that an advance intimation of the survey prepares the respondents while improving their compliance.

The design of a cover letter needs much attention. It should be captivating, clear, and precise, using vocabulary and language specific to the survey's target population. The active voice should be used for greater impact, and crowding of details must be avoided. Italics, bold fonts, or underlining may be used to highlight critical information. The tone ought to be polite, respectful, and grateful in advance. Writing in all capital letters is best avoided, as it is a surrogate for shouting in speech and may leave a bad impression.

The dates of the survey may be intimated in advance, so respondents can prepare to take it at a time convenient to them. When e-mailing a closed group in a convenience-sampled survey, using the name of the addressee may impart a customized experience, enhance trust building, and possibly improve compliance. Appropriate use of salutations such as Mr./Ms./Mrs. may be considered. Various portals, such as SurveyMonkey, allow researchers to save an address list on the website; these contacts may then be reached via an embedded survey link sent from a verified e-mail address to minimize bounced e-mails.

The body of the cover letter must be short and crisp, ideally not exceeding 2–3 paragraphs. Earnest efforts to protect confidentiality may go a long way in enhancing response rates.38 While it is tempting to provide incentives to enhance responses, these are best avoided.38,39 When indirect incentives are offered, such as provision of the survey results, these should be clearly stated in the cover letter. Lastly, a formal closing note with the signature of the lead investigator is welcome.38,40

Designing questions

Well-constructed questionnaires are the backbone of successful survey-based studies. With this type of research, the primary concern is adequate promotion and dissemination of the questionnaire to the target population. The selection of the sample population therefore needs to be done carefully and with minimal flaws. The method of conducting the survey is an essential determinant of the response rate observed.41 Broadly, surveys are of two types: closed and open. The method of conducting the survey must be determined according to the sample population.

Various doctors use their own patients as the target demographic, as this improves compliance. However, this is effective only for surveys addressing a geographically specific, fairly common disease, as the sample size needs to be adequate. Response bias can be identified by comparing data collected from respondent and non-respondent groups.42,43 It is therefore more efficacious to choose a target population whose baseline characteristics are already known. For surveys focused on patients with a rare group of diseases, online surveys or e-surveys can be conducted, and data can also be gathered from national organizations and societies across the world.44,45 Computer-generated random selection can then be used to choose participants from these databases, who can be reached via e-mail or social media platforms such as WhatsApp and LinkedIn. In both these scenarios, closed questionnaires can be used; these have restricted access, either through a URL link or through e-mail.

In surveys targeting an issue faced by a larger demographic (e.g., pandemics such as COVID-19, flu vaccination, or socio-political scenarios), open surveys are the more viable option, as they can be easily accessed by the majority of the public and ensure a large number of responses, thereby increasing the accuracy of the study. Survey length should be kept optimal to avoid poor response rates.25,46

SURVEY DISSEMINATION

Uniform distribution of the survey gives the entire target population an equitable opportunity to access the questionnaire and participate in it. While deciding on the target demographic, the relevant communities should be studied, and the process of "lurking" is sometimes practiced. Multiple sampling methods are available (Fig. 1).47

The survey can be distributed to the target demographic by e-mail. Even though e-mails reach a large proportion of the target population, an unknown sender may be blocked, making the use of a personal or previously used e-mail address preferable for correspondence. Adding a cover letter to the invitation adds a personal touch and is hence advisable. Some platforms allow the sender to link the survey portal with the sender's e-mail address after verifying it. Notably, in the authors' experience, personal communication over the phone or by instant messaging improved responses even after repeated e-mail reminders.48,49

Distribution of the survey over social media platforms (SMPs, namely WhatsApp, Facebook, Instagram, Twitter, LinkedIn, etc.) is also practiced.50,51,52 Distributing a survey on every available platform ensures maximal outreach.53 Other smartphone apps can also be used for wider survey dissemination.50,54 It is important to be mindful of the target population when choosing the platform, as some SMPs such as WhatsApp are more popular in India, others like WeChat are used more widely in China, and Facebook is widely used among the European population. Professional or popular social accounts can be used to promote a survey and increase its outreach.55 Incentives such as internet giveaways or meet-and-greets with a favorite social media influencer have been used to motivate people to participate.

However, social media platforms do not allow calculation of the denominator of the target population, making it impossible to determine an accurate response rate. Moreover, this method of collecting data may introduce a respondent bias inherent to communities with a greater online presence.43 The inability to gather the demographics of non-respondents (in a bid to show that they were no different from respondents) is another challenge of convenience sampling, unlike in cohort-based studies.

Lastly, manual filling of surveys over the telephone, by narrating the questions and answer choices to the respondents, is used as a last resort to achieve the desired response rate.56 Studies reveal that surveys released on Mondays, Fridays, and Sundays receive more traction, and reminders set at regular intervals help elicit more responses. In collaborative research, data collection can be improved by syncing surveys with electronic case record forms.57,58,59

DATA ANONYMITY

Data anonymity refers to the protection of data received as part of the survey. These data must be stored and handled in accordance with patient privacy rights and the privacy protection laws applicable to surveys. Ethically, the data should be received into a single source file handled by one individual. Sharing or publishing these data on any public platform is considered a breach of the patient's privacy.11 In convenience-sampled surveys conducted by e-mailing a predesignated group, the e-mail addresses must remain confidential, as inadvertently sharing them as supplementary data in the manuscript may amount to a violation of ethical standards.60 A completely anonymized e-survey discourages collection of Internet protocol addresses in addition to other patient details such as names and e-mail addresses.

Data anonymity gives respondents the confidence to be candid and answer the survey without inhibition. This is especially apparent in minority groups or communities facing societal bias (sex workers, transgender people, lower-caste communities, women). Data anonymity reassures respondents regarding their privacy, and because respondents play the primary role in data collection, it plays a vital role in survey-based research.

DATA HANDLING OF SURVEYS

The data collected from the survey responses are compiled in .xls, .csv, or .xlsx format by the survey tool itself. The data can be viewed during the survey period or after its completion. To ensure data anonymity, a minimal number of people should have access to these results. The data should then be sifted to remove false, incorrect, or incomplete records, and the relevant, complete data analyzed qualitatively and quantitatively, as per the aim of the study. Statistical aids such as pie charts, graphs, and data tables can be used to report the results.
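
As a purely illustrative example of this workflow, the pandas sketch below loads a hypothetical .csv export, drops incomplete records, de-duplicates respondents, and produces a simple frequency summary; the file and column names are assumptions, not those of any particular survey tool.

```python
# Hypothetical sketch of compiling and cleaning an exported survey file with pandas.
import pandas as pd

def load_and_clean(path: str) -> pd.DataFrame:
    """Read the survey export, drop incomplete responses, and de-duplicate respondents."""
    df = pd.read_csv(path)                                # .csv export from the survey tool
    df = df.dropna(subset=["age", "specialty", "q1"])     # discard incomplete records (assumed columns)
    df = df.drop_duplicates(subset=["respondent_id"])     # one record per respondent
    return df

def summarize(df: pd.DataFrame) -> pd.Series:
    """Simple frequency table suitable for reporting alongside charts."""
    return df["q1"].value_counts(normalize=True).round(3)

if __name__ == "__main__":
    cleaned = load_and_clean("survey_export.csv")         # hypothetical file name
    print(summarize(cleaned))
```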

ANALYSIS OF SURVEY DATA

Analysis of the recorded responses is done after the period made available to answer the survey has ended. This ensures that statistical conclusions and hypotheses are established after careful study of the entire database. Depending on the study, the analysis may be based on complete responses alone or may also include incomplete ones. Survey-based studies require careful consideration of various aspects of the survey, such as the time required to complete it.61 Cut-off points on completion time help distinguish authentic answers from disingenuously completed questionnaires. Methods of handling incomplete questionnaires and atypical timestamps must be pre-decided to maintain consistency. Since surveys were often the only way to reach people during the COVID-19 pandemic, disingenuous survey practices must be avoided, as the resulting data may later be used to form preliminary hypotheses.
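
One way to operationalize such pre-decided cut-offs is sketched below: completion times outside a plausible window are excluded before analysis. The thresholds and column names are hypothetical and should be fixed in advance in the study protocol rather than chosen after seeing the data.

```python
# Hypothetical sketch: excluding responses with implausible completion times.
import pandas as pd

MIN_SECONDS = 120    # assumed lower cut-off: faster suggests disingenuous answering
MAX_SECONDS = 3600   # assumed upper cut-off: atypical timestamps, e.g., abandoned sessions

def filter_by_completion_time(df: pd.DataFrame) -> pd.DataFrame:
    """Keep only responses whose completion time falls within the pre-decided window."""
    duration = (pd.to_datetime(df["end_time"]) - pd.to_datetime(df["start_time"])).dt.total_seconds()
    return df[(duration >= MIN_SECONDS) & (duration <= MAX_SECONDS)]

if __name__ == "__main__":
    data = pd.DataFrame({
        "respondent_id": [1, 2, 3],
        "start_time": ["2020-08-01 10:00:00", "2020-08-01 10:05:00", "2020-08-01 10:10:00"],
        "end_time":   ["2020-08-01 10:07:00", "2020-08-01 10:05:40", "2020-08-01 10:25:00"],
    })
    print(filter_by_completion_time(data))   # respondent 2 (40 s) is excluded
```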

REPORTING SURVEY-BASED RESEARCH

Reporting survey-based research is by far the most challenging part of this method. A well-reported survey-based study is a comprehensive report covering all aspects of conducting the research.

A descriptive report of a survey-based study covers the design of the survey, mentioning the target demographic, sample size, language, type and methodology of the survey, and the inclusion-exclusion criteria followed. Details regarding the conduct of pilot testing, validation testing, reliability testing, and user-interface testing add value to the report and support the data and analysis. Measures taken to prevent bias and to ensure consistency and precision are key inclusions. The report usually mentions approvals received, if any, along with the written informed consent taken from participants to use the data for research purposes. It also gives a detailed account of the different distribution and promotional methods followed.

A detailed account of the data input and collection methods, along with the tools used to maintain participant anonymity and the steps taken to ensure singular participation from each respondent, indicates a well-structured report. Descriptive information about the website used, the visitors received, and external factors influencing the survey should be included. Detailed reporting of the post-survey analysis, including the number of analysts involved, any data cleaning required, the statistical analysis performed, and the probable hypotheses drawn, is a key feature of a well-reported survey-based study. Any statistical corrections applied should also be described. The EQUATOR network provides two checklists, the "Checklist for Reporting Results of Internet E-Surveys" (CHERRIES) statement and "The Journal of Medical Internet Research" (JMIR) checklist, that can be utilized to construct a well-framed report.62,63 Importantly, self-reporting of biases and errors avoids carrying forward false hypotheses as the basis of more advanced research. References should be cited according to standard recommendations and the journal's specifications.64

CHOOSING A TARGET JOURNAL FOR SURVEY-BASED RESEARCH

Surveys can be published as original articles, brief reports, or letters to the editor. Interestingly, most modern journals do not specifically mention surveys in their instructions to authors. Thus, depending on the study design, the authors may choose the appropriate article category, such as a cohort, case-control, interview-based, or survey-based study. It is prudent to mention the type of study in the title. Titles should not exceed 10–12 words and may feature the type of study design after a semicolon, for clarity and greater citation potential.

While the choice of journal is largely based on the study subject and left to the authors' discretion, it may be worthwhile to explore trends in a journal's archive before proceeding with submission.65 Although the article format is similar across most journals, the specific rules of the target journal should be followed when drafting the article structure before submission.

RETRACTION OF ARTICLES

Retracted articles are those that are withdrawn from publication after release. Articles are usually retracted when discrepancies come to light regarding the methodology followed, plagiarism, incorrect statistical analysis, inappropriate authorship, fake peer review, fake reporting, and the like.66 A substantial increase in such papers has been noted.67

We carried out a search for "surveys" on Retraction Watch on 31 August 2020 and received 81 search results published between November 2006 and June 2020, of which 3 were duplicates. Of the remaining 78 results, 37 (47.4%) articles were surveys, 23 (29.4%) were of unknown type, and 18 (23.2%) were other types of research (Supplementary Table 1). Fig. 2 gives a detailed description of the causes of retraction of the surveys we found and their geographic distribution.

Fig. 2. Causes of retraction of the surveys identified and their geographic distribution.

CONCLUSION

A good survey ought to be designed with a clear objective; the design should be precise and focused, with close-ended questions and all likely answer options included. Using rating scales, multiple-choice questions, and checkboxes, and maintaining a logical question sequence, engages the respondent while simplifying data entry and analysis for the investigator. Pilot testing is vital to identify and rectify deficiencies in the survey design and answer choices. The target demographic should be well defined, and invitations sent accordingly, with periodic reminders as appropriate. While reporting the survey, transparency in the methods employed should be maintained, and the shortcomings and biases clearly stated, to prevent an invalid hypothesis from being advocated.

Disclosure: The authors have no potential conflicts of interest to disclose.

Author Contributions:

  • Conceptualization: Gaur PS, Zimba O, Agarwal V, Gupta L.
  • Visualization: Gaur PS, Zimba O, Agarwal V, Gupta L.
  • Writing - original draft: Gaur PS, Gupta L.

SUPPLEMENTARY MATERIAL

Reporting survey based research
