UK experiences of questions on sexual identity
Peter Betts
Tables
Table 1 Surveys reviewed and suppliers of information
Table 2 Minimum and maximum percentage rates for each substantive sexual identity category and type of missing data, by survey mode
Table 3 Question stems and response categories
1 Executive Summary
Methodology: Ten public sector surveys were reviewed (see footnote 1). Their target populations
covered people of all sexual identities, though in some cases were limited
geographically or to people with certain characteristics. Information was obtained by
semi-structured telephone interviews with researchers at sponsoring organisations or
research and fieldwork contractors, and by further correspondence and reference to
survey reports and other materials. Information collected included survey and sexual
identity question designs, estimates produced, and feedback from respondents and
interviewers. A content analysis of the information obtained was conducted. A
summary of information collected (sample designs, modes of administration, question
designs and estimates) is at Appendix A.
Analysis & drawing conclusions: The review provided useful insight, generated
hypotheses and identified areas for research through the remainder of the project.
However confounding factors, such as the small number of surveys reviewed,
differing survey and question designs, small sample sizes, the uncertainty as to the
robustness of estimates and the lack of a reliable benchmark of the proportion of the
population which is lesbian, gay or bisexual (LGB), sometimes made comparisons
difficult. In this review terms such as ‘the LGB population’ and ‘heterosexual
respondents’ are used as shorthand for people who selected particular response
categories.

Footnote 1: Northern Ireland Life and Times Survey (2005); Policing For London Survey (2000); Employees’ Awareness, Knowledge and Exercise of Employment Rights Survey (2005); Fair Treatment at Work Survey (2005); The National Mental Health and Ethnicity Census 2005 Service User Survey (2005); The National Mental Health and Learning Disability Ethnicity Census (2006); British Social Attitudes Survey (2005); Newham Household Panel Survey (2002-6); Scottish Census Small Scale Test (2005); Civil Service Diversity Survey (2001). While this review was conducted the Adult Psychiatric Morbidity Survey (APMS, 2006-7) and Citizenship Survey (2007-8) introduced questions on sexual identity into the field – limited commentary on these is included.
Survey estimates: Estimates were obtained for most of the surveys. The proportion of respondents self-identifying as LGB ranged from 0.3% to 3.0%, lower than the government estimate that LGB people constitute 5% to 7% of the population.
Concept being measured: All surveys asked a single question to capture sexual
identity, rather than asking separate questions about its different dimensions. Little
information was provided about the conceptual basis of the questions, or what
respondents understood them to be asking, and on what basis their answers were
given. A respondent’s answer might vary according to which dimension(s) they are
considering. ONS’s project will include qualitative research with members of the
public into the concept to be measured. It will address the feasibility of capturing
sexual identity in a single question, in the context of government social surveys with
varying purposes.
Only a few terms were used in the response categories, but there was great variety as
to which formal and colloquial alternatives were used and in which combinations: for
example, whether ‘heterosexual’ was used with or without ‘straight’, or ‘gay’ and
‘lesbian’ with or without ‘homosexual’. Definitions of categories were generally not
provided. Some respondent comprehension problems were apparent, particularly
regarding ‘heterosexual’. No effects on estimates resulting from the terms used or the
order of categories were obvious. The ONS qualitative work will investigate understanding and use of terms, which response categories are appropriate, and whether definitions need to be provided. The need for additional response categories, such as ‘other’ and ‘none of the above’, and for alternatives to the categories commonly provided, will also be explored.
Missing data: All surveys recorded missing data, from item non-response (‘don’t
know’, refusal to an interviewer, or leaving a self-completion question unanswered),
or from response categories such as ‘do not wish to answer’, or from both. Higher
proportions of missing data tended to be found on self-administered surveys (totals
ranging from 1.4% to 25.0%) compared with interviewer-administered surveys (0.2%
to 9.0%). A possible issue was identified regarding potentially higher non-response
among minority ethnic groups due to cultural/religious beliefs which will be looked at
in the ONS development work.
Confidentiality: Surveys did not mention confidentiality at the point the question was
asked, though all gave general assurances elsewhere, for example in advance or
covering letters. Little information was gathered about respondents’ views on
confidentiality and fears of disclosure of their data to other parties; this will be
covered in the ONS research.
Privacy in the survey environment: There was little evidence of respondent concern
about privacy, that is, the interviewer or any other person present knowing which
answer had been given. On interviewer-administered surveys the common approach to
providing privacy and reducing embarrassment was use of concealed response show
cards, where respondents answered by giving a letter or number rather than stating
their answer in words. However no surveys prevented interviewers from knowing the
response category given by the respondent. Views on within-household privacy,
concurrent interviewing, and preferences for interviewer- or self-administered modes
will be explored in the ONS research.
Item response
Higher rates of missing data were obtained when questions were administered by self-
completion compared with interviewer-administered mode. This suggested that
perhaps there was an issue of invasion of privacy or objection to being asked about
sexual identity. Feedback from one postal survey supported this assumption, although
it found that objecting to being asked and objecting to giving an answer did not
necessarily correspond. Such concerns were less evident when the question was interviewer-administered, where missing data was lower, probably because respondents tend to be more compliant when an interviewer is present.
Accuracy and completeness of data: When producing benchmark data, it is
important to minimise missing data particularly when the characteristic of interest is
found in a small proportion of the population. However it is also important that the
accuracy of substantive responses (that is, to the sexual identity categories including
other, or similar) is not compromised. This could occur if respondents feel pressured
into providing an answer which is incorrect but which may be considered more
socially acceptable. Mode of administration is an important consideration in relation
to this.
LGB rates across the modes appeared to be more consistent than those for
heterosexual respondents. So it might be that LGB respondents tended to give correct
answers in either mode. However, it is not known whether a proportion of the LGB
population gave incorrect, socially desirable answers in interviewer-administered
mode, or avoided giving substantive answers in self-completion mode.
Estimates of the heterosexual population tended to be higher when the question was
interviewer-administered compared with self-administered questions. Respondents
would have been able to ask interviewers for help in understanding the question or
response categories, which might have reduced the amount of missing data. Another
possible reason is that a proportion of heterosexual respondents avoided giving
substantive answers to self-administered questions, but answered truthfully when
‘forced’ to answer on interviewer-administered surveys.
The questions raised above will be covered in the remaining research. Issues which
will be considered include:
• whether respondents should be allowed to avoid giving substantive answers, in
different modes, and, if so, how;
• whether respondents might give socially desirable answers, and what
characteristics they have;
• the optimal balance between minimising missing data and maximising data
accuracy.
Issues not addressed: Some issues of relevance to the development of a question for
ONS and other surveys were not covered by the review. These included CASI or
telephone methods; administering the question face-to-face to more than one
household member; and proxy data. One survey had a longitudinal design, but little
was discovered about effects of asking the question repeatedly over time. These issues
will be addressed by the ONS project.
2 Introduction
A project was initiated that aimed to provide advice on best practice with regard to
data collection in this field, and also examine the feasibility of providing benchmark
data. The primary outputs from this project will be a question, or suite of questions,
asking people to self-identify to a particular sexual orientation, along with advice on
administration. Alongside the question(s), a user guide will be produced discussing
the conceptual issues as well as the methodological issues, such as context and mode
effects. General information on the project is available at:
http://www.statistics.gov.uk/about/data/measuring-equality/sexual-identity/default.asp
The methodology is described in Wilmot (2007).
There may be issues relating to the administration of this question amongst particular
groups in society. For example, people’s cultural, religious or political beliefs may be
offended. The age of the respondent may also relate to their ability to answer or their
attitudes towards such questioning. Examining personal barriers to response would
enable us to suggest how the methodology could be improved. Respondents’ ability to
comprehend the questions and answers has clear implications for the quality of the
data. Furthermore, the comprehension and attitudes of the interviewers are important
since they are often called upon to justify the inclusion of a particular topic in a
survey.
Data Collection Methodology (DCM) branch was commissioned to develop, test and
evaluate question(s) on sexual identity for implementation in a government social
survey context, with particular reference to the Integrated Household Survey (IHS). This report forms part of the
initial exploratory phase of the project.
As part of the exploratory phase of the project, ONS conducted a review of the
experiences of other organisations in the UK which have administered survey
questions about ‘sexual identity’ or ‘sexual orientation’, reported in this document.
The purpose of the review was to obtain insight which might be useful in informing
subsequent stages of the project, including qualitative exploration of the issues of
interest with the general public and the design and testing of a question, or questions,
for use on social surveys.
A less detailed review of sexual identity questions asked in other countries was
conducted at the same time – this is reported separately (Taylor [a]).
Footnote 2: See http://www.gaydata.org/ms001_Index.html
2.3 Methodology
Data for the review was collected primarily by means of semi-structured telephone
interviews conducted with project managers or others with relevant knowledge (at
either the sponsoring organisation or the survey design/fieldwork contractor). Further
information was provided by email. In addition, research and technical reports,
questionnaires and other field materials, and data estimates were all referred to.
In Appendix B detailed information is provided about each survey, obtained from the
interviews and documents. General information, to provide context for comparison
and assessment of the sexual identity question, includes when it was conducted, who
the sponsor was, the sample design and mode of administration and survey response.
Specific information about the sexual identity question includes the question wording
and response categories; method of administration; how refusals and other missing
information were recorded; estimates (where publicly available); information on
instructions and training given to interviewers; and feedback from respondents and
interviewers on the experience of administering a sexual identity question (e.g.
comprehension and acceptability).
Much of the information, including estimates from the questions where available, is
summarised in a table at Appendix A, to enable easier comparison between surveys
and modes.
Information for some surveys is incomplete, because it was not available, or because
consultation could not always be with researchers involved day to day in the projects,
or because not all the topics could be covered with each organisation. Estimates were
not available for every survey reviewed. Thanks go to the suppliers of the information
at the organisations acknowledged in Table 1 below.
Table 1 Surveys reviewed and suppliers of information
Note that during the period when this review was conducted the Adult Psychiatric
Morbidity Survey (APMS, 2006-7) and Citizenship Survey (2007-8) introduced
questions on sexual identity into the field. Some information was provided by,
respectively, NatCen (the survey contractor) and Communities and Local Government
(the survey sponsor), though no information was available regarding estimates and
little feedback from the field. Generally they are not included in the survey
descriptions and analysis below, except when referred to explicitly.
As part of the development project, in 2006 ONS conducted experiments asking about
sexual identity on the National Statistics Omnibus Survey. A separate report on this
work has been written (Taylor [b]).
3 Survey designs
This section provides some contextual information on the surveys’ sample designs
and modes of administering the sexual identity question, both of which were varied.
3.1 Sample designs and population coverage
Seven surveys were conducted in private households, while three were in institutions
or establishments.
Eight of the surveys used random probability sampling. The exceptions were the Fair
Treatment at Work Survey which was included on an omnibus survey employing
quota sampling, and the National Mental Health and Learning Disability Ethnicity
Census (2006) which was a census of inpatients in mental health and learning
disability establishments.
Five of the surveys sampled the general population. They varied geographically as to
whether they covered Great Britain, particular countries within the UK, London, or a
local authority area. Only one general population survey, British Social Attitudes
Survey, covered Great Britain as a whole, and it should be noted that its sexual
identity question was not intended to measure the prevalence of the LGB population.
The two DTI surveys were limited to people of working age who were currently
working as employees or had worked within the last two years. The Civil Service
Diversity Survey sampled civil servants in all departments. The National Mental
Health and Ethnicity Census 2005 Service User Survey (2005) and the National
Mental Health and Learning Disability Ethnicity Census (2006) covered in-patients in
NHS and private/voluntary mental health and learning disability establishments.
The minimum age of respondents varied between surveys: 15 and over, 16 and over, or 18 and over. All household surveys selected one person to respond,
except the Newham Household Panel Survey which interviewed all household adults.
Among the institutional/establishment surveys, two sampled individuals while the
third was a census.
Concurrent personal interviewing was not conducted on any of the surveys for the
sexual identity question (the Newham Household Panel Survey utilised paper self-
completion). Thus no experience of the data collection environment used on the
majority of ONS’s social surveys could be drawn on to inform the design of ONS’s
questioning.
4 Analysis
Analysis of the information collected provided insight which will be used to inform
the design of the questioning for use on UK government social surveys, and the topics
to be covered in the qualitative research being conducted as part of the development
process. It also generated hypotheses for further testing both qualitatively and
quantitatively. Areas for further research are identified throughout the review.
Comparisons were sometimes difficult to make between the surveys, and conclusions
were hard to draw about the effect of mode, question design, location or context on
the estimates. A number of confounding factors needed to be taken into account. The
number of surveys included in the review was small, overall and for each mode of
administration. The sample designs and survey populations differed. The question
designs also varied, such as in the ways in which respondents could avoid providing
their sexual identity, or answer ‘other’, ‘don’t know’ or ‘do not wish to
answer’/‘prefer not to say’. The robustness of the estimates obtained was in some
cases perhaps doubtful, due to small sample sizes and/or low response rates (both unit
and item), and therefore subject potentially to sample variation and/or non-response
bias.
In this review terms such as ‘the LGB population’ and ‘heterosexual respondents’ are
used as shorthand for people who selected particular response categories.
Footnote 3: The report Final Regulatory Impact Assessment: Civil Partnership Act 2004 (available at http://www.dti.gov.uk/files/file23829.pdf) states: “Whilst no specific data is available, a wide range of research suggests that lesbian, gay and bisexual people constitute 5-7% of the total adult population…” The figure was based on the findings in a number of studies, from various countries, conducted among differing sample populations and measuring different dimensions of sexual identity. A footnote acknowledges that there is very little reliable data about the size of the LGB population.
4.2 Estimates from the surveys
The estimates from the sexual identity question were obtained for most of the surveys.
Weighted data was supplied in most cases. The percentages are shown in detail in the
table at Appendix A. Estimates were not available for two of the ten surveys;
however, these were the two with mental health inpatients, which were the least
typical surveys in terms of sample and design, so estimates from them would,
perhaps, be misleading anyway.
In this section, the estimates are discussed at a general level, while in following
sections more detailed consideration is given to the relationship of estimates to
particular features of the question design.
For those surveys where estimates were available, the table below shows the
minimum and maximum percentage rates for each substantive sexual identity
category and type of missing value. For the purpose of this review, ‘substantive’
categories mean those covering the different sexual identities (including ‘other’ or
similar), while ‘missing data’ refers to responses from categories presented to
respondents such as ‘do not wish to answer’, as well as don’t knows and refusals.
As will be described in Sections 4.5 and 4.6, the category breakdowns and labels varied across surveys – in the table below they are consolidated. Not all types of missing values applied to each survey.
The category percentages are based on total cases, not just valid cases. The ranges are
shown for all surveys, and separately for interviewer-administered and self-
administered.
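To illustrate the difference the choice of base makes, the following minimal Python sketch uses entirely hypothetical counts (the figures and category labels are invented for illustration and are not taken from any of the surveys reviewed):

    # Hypothetical counts for a survey of 1,000 respondents (illustrative only)
    counts = {"heterosexual": 940, "gay/lesbian": 12, "bisexual": 8, "other": 5, "missing": 35}

    total = sum(counts.values())            # all cases, including missing data
    valid = total - counts["missing"]       # substantive (valid) responses only

    for category, n in counts.items():
        if category == "missing":
            continue
        pct_total = 100 * n / total         # base = total cases, as used in Table 2
        pct_valid = 100 * n / valid         # base = valid cases only
        print(f"{category}: {pct_total:.1f}% of total cases, {pct_valid:.1f}% of valid cases")

Percentages based on valid cases only will always be slightly higher than those based on total cases, and the gap widens as the proportion of missing data grows.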
Table 2 Minimum and maximum percentage rates for each substantive sexual identity
category and type of missing data, by survey mode
The estimates had to be treated with caution, since most of the surveys were limited in
their population or geographical coverage, and might be subject to sample design
variation. The sexual identity question on the only GB-wide general population
survey was not intended to measure the prevalence of LGB people.
That said, the main observation is that rates of lesbian, gay and bisexual respondents,
combined, ranged from 0.3% to 3%. This is lower than the 5% to 7% government
estimate (see footnote 3).
The surveys which recorded the highest rates of LGB respondents were not of the
general population. They might reflect different compositions, with regard to sexual
identity, among the specific populations sampled (civil servants and people currently
or recently employed) compared with the general population.
The highest estimate of LGB respondents among any of the general population
surveys was 2.1%.
The estimates varied as to whether prevalence of lesbian & gay (combined) was
higher than bisexual, or vice versa, though lesbian & gay tended to be higher.
Not all the researchers contacted commented on their estimates. But among them were
some who acknowledged that their own figures were unreliable or underestimated the
LGB population – in the words of one, “by any reasonable standards the results are
pretty hopeless…baldly unsatisfactory”. The report on the Scottish Census Small
Scale test questioned the “accuracy” and “utility” of the data, due to high unit and
item non-response.
A hypothesis put forward by researchers on one survey, comparing their estimate with
the benchmark, was that the clustered sample design might have led to under-
representation of the LGB population, if LGB people were concentrated in particular
geographical areas. Whilst factors including survey coverage, sample design and non-
response might explain the low estimates of LGB respondents to some degree, albeit
that the benchmark against which they are being compared is of uncertain reliability,
the findings of the review do provide some useful insight into the effects of the
question design and mode of administration.
The analysis posed a question about the balance to be achieved between, on one hand,
reducing missing values and obtaining as much substantive data as possible, and, on
the other hand, the accuracy of the substantive data. A key question is whether
accuracy is compromised when the question is interviewer-administered, by, for
example, respondents giving more socially desirable answers. This subject is
discussed in section 4.8.
Further research
ONS’s programme of development work will be addressing the concept which is
intended to be measured and whether it can be captured in a single question or
requires more than one (see also sections 4.4 and 4.5). The work will consider the
points of view of both members of the public and the analysts and users of the data. If
more than one question is ideally required, the feasibility and appropriateness of
asking more than one question in the context of government social surveys with
different purposes will be considered.
All surveys had at least one question about a subject where a person’s sexual identity
could have a bearing, e.g. experience of discrimination, service provision, diversity
issues or social attitudes. The British Social Attitudes Survey question was used to
identify respondents with particular identities for subsequent questions.
Questions tended not to include any mention of the purpose of asking (see section 4.4
on preambles and explanations below). Little was discovered about the public’s
perceptions of the purpose of asking about sexual identity. Focus group research
conducted during the development of one survey indicated that people accepted the
use of the question in monitoring diversity and unfairness in the workplace.
A different purpose applied to the Scottish Census Small Scale Test, which was to
assess the feasibility of asking such a question on the 2011 Census in Scotland.
Due to the confounding factors mentioned above it was difficult to assess what effect,
if any, the perceived purpose and context of the questions had on the data quality (e.g.
how carefully respondents considered the question), or on non-response (whether unit
or item, due, for example, to sensitivity over subject matter or data confidentiality).
Further research
The purpose, context and location of a sexual identity question will be discussed with
participants in ONS’s planned qualitative research.
4.4 Preambles to and explanations of the question
Explanations specifically relating to the sexual identity question – whether about its
purpose or to provide confidentiality assurances – were generally not provided. There
was no conclusive evidence as to the relative benefits and drawbacks of a prescribed
preamble, or an optional explanation, to the sexual identity question. The limited
feedback available was from interviewers’ perspectives, not respondents’.
A further clause on confidentiality (“As with all our questions, your answers will be
kept strictly confidential”) was included in the question which was cognitively tested
but dropped prior to the survey going live. It was dropped because interviewers felt it
was unnecessary and made the question more of a “big deal”.
Two examples were given of a preamble being dropped after initial use on
interviewer-administered surveys. The preambles addressed confidentiality and/or
attempted to explain terminology.
Another preamble was dropped after being used in the field for some time. This
preamble explained that
“Some people have a partner of the same sex as themselves (homosexual)
while others have a partner of the opposite sex (heterosexual).”
The reason for it being dropped was not ascertained. This preamble, being behaviour-
related, would not be appropriate for a question about self-perceived sexual identity.
It is not known how often it was used by interviewers or its reception by respondents.
Further research
The necessity for a preamble or explanation, and the dimensions which should be
covered, is an issue for the exploratory and main stages of ONS’s
research. That is, whether respondents require explanation of the question’s meaning
and language to aid comprehension and improve accuracy, or of its purpose and
assurances of confidentiality to maximise unit response. Or, on the other hand,
whether to mention these things draws respondents’ attention to its potential
sensitivity which would otherwise not occur to them. Omnibus Survey interviewer
feedback will also be pertinent.
Table 3 Question stems and response categories
Adult Psychiatric Morbidity Survey (Version A)
Question stem: Which statement best describes your sexual orientation? This means sexual feelings, whether or not you have had any sexual partners.
Response categories: Entirely heterosexual (attracted to persons of the opposite sex); Mostly heterosexual, some homosexual feelings; Bisexual (equally attracted to men and women); Mostly homosexual, some heterosexual feelings; Entirely homosexual (attracted to persons of the same sex); Other

Adult Psychiatric Morbidity Survey (Version B)
Question stem: Please choose the answer below that best describes how you think of yourself…
Response categories: Completely heterosexual; Mainly heterosexual; Bisexual; Mainly gay or lesbian; Completely gay or lesbian; Other
It was difficult to assess the effect of different question stem wordings on estimates –
even between those which did and did not mention ‘orientation’ or ‘sexuality’ – since
there were too many confounding factors.
The majority of questions did not refer explicitly to the subject in the stem, leaving it
to the response categories to convey it. Of those which did mention the subject, the
reference was either to ‘sexual orientation’ or ‘sexuality’. (The Citizenship Survey
referred to ‘sexual identity’).
Definition provided
Only the Adult Psychiatric Morbidity Survey provided any definition of the
question’s subject, one of the two variants explaining that ‘sexual orientation’ meant
“sexual feelings, whether or not you have had any sexual partners”. Such a definition
of sexual orientation was more limited than that which the ONS is intending to
measure (see section 2.2).
One exception asked respondents to ‘state’ their sexual orientation; the designer of
this question perhaps expected that answers would be more ready-made or concrete.
The latter approach is more similar to that often used in other countries (and on the second NS Omnibus trial), which asks ‘Do you consider yourself to be…’ and perhaps anticipates a more self-assertive assessment than the passivity of choosing a category that describes them.
Perspective
As for whose point of view should be taken, questions varied as to whether they
made it clear that it was respondents’ own perception (e.g. ‘…how you think of
yourself’) or left it more open to interpretation (e.g. ‘…best describes you’).
Respondent comprehension
What respondents understood the question to be asking about is uncertain. The
evidence, from a limited amount of feedback from cognitive testing and written
comments by respondents to one postal survey, suggests that some definition of what is being measured needs to be provided.
Comments from respondents to the Scottish Census Small Scale Test included those
that "suggested they had possibly misunderstood or misinterpreted the question".
These included an elderly woman who answered the question with a sexual
orientation but did not think the question applied to her, and another who had to think
what the question meant but answered as best she could.
In the cognitive testing of the Citizenship Survey question it was found that
respondents’ understanding of it included “who you like as a partner” and “it’s what you are or your sexual preference is”. These comments suggest respondents were
considering behavioural and/or identity-related dimensions, and perhaps not
consistently.
Further research
The conceptualisation of sexual identity and the need or otherwise to make reference
to it in the question stem will be explored in the ONS research. Whether a definition needs to be provided, and if so what it should be, will also be addressed. So will the need to
make clear the perspective to be taken – that is, the respondent’s self-perception rather
than it being left more ambiguous. Whether it is better to ask if respondents consider
themselves to have a particular sexual identity, implying they have considered it
previously, or to choose a category which best describes them, which allows for them
not having given it much thought, will also be assessed.
4.5.2 Response categories
One of the exceptions, the Policing for London Survey, included categories from
‘completely heterosexual’, through ‘mainly heterosexual’, ‘bisexual’, ‘mainly gay or
lesbian’ to ‘completely gay or lesbian’. While self-perceived sexual identity might
indeed be located on such a continuum, such categories might imply that behaviour or
desire was being asked about, without the question stem being specific. Indeed on the
other exception, the Adult Psychiatric Morbidity Survey, the question stem did
include a definition of sexual identity, as ‘sexual feelings’. It also used slightly
different vague quantifiers, substituting ‘entirely’ for ‘completely’ and ‘mostly’ for
‘mainly’, though which, if either, is better is probably unimportant.
Again the confounding factors make it difficult to compare the estimates from discrete
or ‘continuous’ [sic] categories and draw any conclusions. From an analytical
perspective, though, some interesting questions arise, assuming that it is possible to
capture sexual identity with a single question.
Firstly, would the 0.4% of respondents choosing the ‘mainly’ gay/lesbian category
have placed themselves in the gay/lesbian category, the bisexual category or another
category, if given absolute choices? The importance of this question perhaps is
dependent on the extent to which, analytically, bisexual respondents need to be
distinguishable from gay and lesbian respondents.
Secondly, using less absolute categories would have implications for measuring the
prevalence of LGB people in the population. For example, should the total LGB
population include respondents in the ‘mainly heterosexual’ category, which implies
some degree of LGB orientation? On the Policing for London Survey, the LGB rate
was 1% when including the bisexual and ‘completely’ or ‘mainly’ gay/lesbian
categories. If the ‘mainly heterosexual’ category, chosen by 2.3% of respondents, was
included, the total LGB proportion thus increased to 3.3%. However, it is not known
on which dimension(s) of sexual identity respondents were basing their answers.
Identity might be more discrete than other dimensions such as behaviour and
attraction. Making the question wording more specific would be likely to have an
effect.
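As a rough illustration of how the choice of which graded categories to count as LGB changes the headline figure, this Python sketch uses the Policing for London Survey percentages reported in Appendix A; the ‘narrow’ and ‘broad’ groupings are assumptions made purely for illustration:

    # Policing for London Survey estimates, per cent of all cases (Appendix A)
    rates = {
        "completely heterosexual": 94.4,
        "mainly heterosexual": 2.3,
        "bisexual": 0.2,
        "mainly gay or lesbian": 0.4,
        "completely gay or lesbian": 0.4,
    }

    # Narrow definition: bisexual plus 'completely' or 'mainly' gay/lesbian
    narrow = rates["bisexual"] + rates["mainly gay or lesbian"] + rates["completely gay or lesbian"]

    # Broader definition: also count 'mainly heterosexual' as implying some LGB orientation
    broad = narrow + rates["mainly heterosexual"]

    print(f"LGB (narrow definition): {narrow:.1f}%")   # 1.0%
    print(f"LGB (broad definition):  {broad:.1f}%")    # 3.3%

Run as written, the sketch reproduces the 1.0% and 3.3% figures quoted above.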
Furthermore, all surveys allowed only one response option to be chosen, rather than
allowing for the possibility that respondents consider themselves to have more than one sexual
identity simultaneously (which might, in theory, be the case in different contexts – for
example, domestic, work, social – and depending on which dimensions of sexual
identity – behaviour, attraction, lifestyle etc - they wished to consider).
Further research
The issue of whether people consider their sexual identity to be absolute or more
qualified, and how they would respond if given absolute categories, will be covered in
the qualitative research exploring the conceptualisation of sexual identity. Whether
people consider themselves to have a single sexual identity (consistent across all
dimensions, or considered to be the ‘overall’ identity when there are differences
between particular dimensions) or more multiple, fluid identities from which they
cannot derive an overall identity, will be another topic.
‘Heterosexual’ was used both on its own, and in combination with ‘straight’, but the
latter was never used on its own. No other term for heterosexual was used.
‘Gay’ and ‘lesbian’ were used both with and without ‘homosexual’. No other terms
for homosexual were used. ‘Homosexual’ was used alone only in the Adult
Psychiatric Morbidity Survey.
When both formal and colloquial terms were used in the same category, the order
varied for ‘heterosexual’ and ‘straight’, but ‘gay’ and ‘lesbian’ always preceded
‘homosexual’. There were instances where the second term used in a category was in
parentheses, e.g. ‘heterosexual (straight)’. The colloquial terms were sometimes given
in inverted commas, e.g. ‘straight’, perhaps to convey that they were less formal.
There were instances where a colloquial term was included for some categories in the
set but not for others; for example, a category for ‘heterosexual’ (without reference to
straight) and one for ‘gay/lesbian’ (without reference to homosexual).
There did not appear to be any meaningful difference in the estimates obtained from
questions which could be put down to using particular variants of gay, lesbian and
homosexual.
‘Gay’ and ‘lesbian’ were usually combined in a single category. There were two
instances of them as separate categories. There appeared to be little difference in the
estimates of homosexual respondents obtained from separate or combined categories,
though the confounding factors make this difficult to assess. Since respondents’ sex is
generally collected by surveys, there would seem to be no need to distinguish between
homosexual men and women, unless the ONS qualitative research reveals either or
both groups to strongly object to being together in a single category.
Explanation/definition of terms provided to respondents
Only two surveys included a definition of each category. In one instance, the National
Mental Health…Survey (2005), the explanation was expressed in terms of having a
‘like’ for people of the same or opposite sex, or both sexes. In the other, the Adult
Psychiatric Morbidity Survey, the definition was in terms of ‘attraction’ to the same
or opposite sex, and ‘feelings’ (as was also defined in the question stem). Such
explanations seem more appropriate to questions about behaviour or attraction
specifically, and would probably not be appropriate for a question about identity.
The need or otherwise for an explanation of each category, or specific categories, will
be explored in the qualitative work.
That survey did not use the word ‘straight’. Such comprehension problems were also
cited in McManus. ‘Heterosexual’ has also been reported to be unfamiliar or
misunderstood, particularly among older or less well educated respondents, according to
feedback from interviewers working on the NS Omnibus Survey trials. The cognitive
testing of the Citizenship Survey question found that there was some “confusion”
about the term ‘heterosexual’ – even though the term ‘straight’ was also given – and a
“process of elimination” used to decide the correct category.
A resulting hypothesis is that some heterosexual respondents are less likely to have
considered ‘sexual identity’ and to understand and use the associated terminology,
than LGB respondents or younger or more highly educated heterosexuals.
Regarding the presence or absence of the term ‘straight’ in the heterosexual category,
the two postal data collections did not include it. The high rates of ‘prefer not to say’
or ‘not answered’ they produced might be explained partly by this.
It would be useful to understand how respondents who did not understand terms
answered the question and the implications for the accuracy of data. That is, did they
choose a substantive category, and if so which, or did they choose ‘other/none’?
Alternatively, did they choose ‘do not wish to answer’ or not provide any answer at
all?
Further research
The qualitative work being conducted by ONS will explore the terminology used,
understood and preferred by respondents of all sexual identities, and which response categories are appropriate. It will also investigate respondents’ answering
strategies when they do not have a good understanding of the terms used.
Category order
The questions generally, but not always, placed the heterosexual/straight category
first, followed by homosexual/gay/lesbian, then bisexual. The rationale seems likely
to have been to reflect the anticipated order of incidence of the main categories
(though while heterosexual is undoubtedly most prevalent, the relative incidence of
homosexual and bisexual is less certain and might vary according to what concept is
measured – e.g. identity, behaviour, attraction). One survey placed the categories in
alphabetical order. It can only be speculated whether this was an attempt to avoid implying
any kind of hierarchy.
No obvious effect of the category order, such as primacy, was apparent from the
estimates. It can be observed that the survey which placed ‘bisexual’ first obtained the
highest rate for that category across all surveys. However, other surveys had higher
rates for bisexual than for gay/lesbian, even when it was later in the order.
Further research
Category order will be considered as part of the ONS development work.
‘Other’ category
The surveys varied as to whether or not they included an ‘other’ category (or similar).
Where ‘other’ was not presented to respondents, the research assumption, presumably,
was that the categories offered were comprehensive.
When questions did not include an ‘other’ option, it is not known how respondents
who did not think any of the substantive categories applied answered (e.g. answer one
of the substantive categories, or don’t know, or not provide an answer).
The availability of ‘other’, ‘none’ and ‘can’t choose’ categories, and the potential
need for any alternative substantive categories, will be another topic of investigation
during the qualitative stages of the development project.
Alternatives to ‘other’
Alternative categories to ‘other’ were used in two instances. One self-completion
question included: ‘If none of the above applies, (please write) I am_______’, while
another offered ‘Can’t choose’ as a category (with no follow up ‘specify’ question).
There appeared to have been some fluctuation in the rates across the ‘other’, ‘none…’
and ‘can’t choose’ categories. Intuitively, it is possible that each was perceived by
respondents differently. Unfortunately the written responses from the ‘none…’
category were not available for review. Any differences can only be hypothesised:
perhaps ‘none…’ and ‘can’t choose’ were more inclusive for respondents who did not
consider themselves to have any sexual identity. Or perhaps they provided a more
suitable alternative for respondents who didn’t understand the meaning of the sexual
identity categories.
Further research
The need to provide an ‘other’ category and a ‘specify’ question, and the way in
which they are used, will be addressed (although analysis of ‘other – specify’ data
from the NS Omnibus trial suggests it is a minor issue). Also to be considered are
whether ‘none’ or ‘can’t choose’ should be considered as a category, and provision of
alternative categories at the main question or a follow-up question.
On self-administered surveys rates of missing data were higher, from 1.4% to 25.0%. This time the lower
figure was the outlier; without it the lower end was 8.0%.
When ‘do not wish to answer’ was not included on the show cards, interviewers were
instead able to code spontaneous refusals. Refusals ranged from 0.2% to 9%, though
see ‘Interviewer issues’ below regarding the latter figure. These surveys also included
an ‘other’ category, or allowed ‘don’t know’ to be recorded (though none of the
estimates provided contained cases in both ‘other’ and ‘don’t know’).
Despite the inclusion of a ‘do not wish to answer’ category the Scottish Census Small
Scale Test and the Newham Household Panel Survey also experienced high levels of
the question being left unanswered.
4.6.2 ‘Don’t know’ responses
Only one survey allowed the recording of ‘don’t know’ answers, if given
spontaneously to the interviewer. At 1.6%, these responses outnumbered refusals
(0.7%). It is worth noting that this survey did not use the term ‘straight’; as previously
mentioned, evidence suggests some people do not know the meaning of
‘heterosexual’, which might explain this rate to some extent.
It is not possible to gauge whether respondents answered ‘don’t know’ due to their
lack of comprehension of the question or knowledge of terms, and/or their inability to
decide what their sexual identity was.
Nor is it possible to say how many respondents would answer ‘don’t know’ if it was
possible to record (other than on the basis of the one survey which recorded it); or
whether they would instead not answer the question at all, would answer ‘other’ or
‘do not wish to answer’, guess, or use a process of elimination to answer with one of
the substantive categories.
The Newham Household Panel Survey found that white respondents were more likely
to answer the sexual identity question than respondents from Asian and ‘other’ ethnic
groups:
“In all waves those giving their ethnic origin as 'white' were significantly
more likely to answer the question; whilst those giving their ethnic origin as
'Asian' or 'Other' were significantly less likely to answer the question. All
'Asian' categories displayed this tendency. Around a third of respondents from
Asian origins did not answer the sexuality questions compared with around 1
in 7 of those with a ‘White' ethnic origin.”
In addition to the findings on not answering the question at all, respondents whose first language was not English were, on average, slightly more likely to answer ‘none of the above’ and much more likely to answer ‘I do not wish to answer this question’. It can be
hypothesised that some respondents from minority ethnic groups had cultural or
religious beliefs resulting in objection to being asked the question or unwillingness to
answer it. It is not possible to say whether there were any problems with
comprehension due to English not being respondents’ main language.
The Policing for London Survey carried out an ethnic boost to its sample. The figures
on missing data by ethnicity were not obtained for that or any other survey, so no
comparison with the Newham findings could be made. However, an interviewer
commented that whilst there were problems with comprehension, including among
respondents with limited English, “I don't recall any problems of privacy,
confidentiality or acceptability, and people were quite happy to try to answer the
question.”
Further research
Minority ethnic groups’ views of the question will be an area which ONS
intends to explore in the qualitative research.
Following on from the above references to missing data, and respondent beliefs,
embarrassment and objection to the question, the review will now consider more
generally the topics of privacy in the survey environment (i.e. with regard to other
people in the household or an interviewer), confidentiality (i.e. disclosure of
information to other people), and acceptability of asking about sexual identity (i.e.
feelings about the personal nature of the subject and of invasion of privacy), and their
effect on survey response, and the completeness and accuracy of data.
In focus groups conducted as part of the development of the Civil Service Diversity Survey, concerns had been expressed about respondents being individually identifiable.
Such concerns might have been higher in the context of a survey conducted for
respondents’ employers than they would be in surveys conducted for sponsors less
immediately related to them.
Further research
People’s views on confidentiality, and fears of the likelihood and consequences of
disclosure to other parties, is another subject to be covered in the qualitative research.
4.7.2 Providing privacy in the survey environment
Only limited information was available about respondents’ views on privacy in the
survey environment. Researchers were generally unaware of any widespread concern
fed back by interviewers. There was only a little evidence of respondent concern.
Interviewer-administered questions
The common approach taken to providing privacy to respondents on the interviewer-administered surveys was the use of concealed response show cards. That
is, respondents were asked to give a letter or number corresponding to the response
category, rather than state the answer in words. The letters or numbers were usually,
though not always, randomised – i.e. they were not immediately sequential or in
alphabetical/numerical order (e.g. ‘R, D, P, H’ rather than ‘A, B, C, D’). This
approach was intended to reduce the potential for embarrassment. On the Fair
Treatment at Work Survey pilot the show card question was "not met with any
resistance".
However, the concealed response show card method was only intended to provide
privacy between the respondent and any people present other than the interviewer. No
surveys made any assurances to respondents that the interviewer would be unaware of
the response category to which the number or letter corresponded. Indeed, it was said
by a respondent to one survey that interviewers knew the answer so it was “clearly
not anonymous”. The issue of social desirability due to the presence of interviewers is
addressed in section 4.8.
In one of the surveys conducted in mental health establishments, the intention was
that the interview be conducted in a private place but that was dependent on
availability. On the National Mental Health … Census (2006), it is unknown what
efforts to provide privacy from other staff or patients were made by the staff
conducting it.
Self-administered questions
Respondent preferences for interviewer-administered or self-administered questions
are little known.
One researcher recollected that a respondent on a self-completion survey had
contacted the organisation to say he had not “felt comfortable” answering the
question, as the answer might be seen by other people (it was not specified whether
this meant other people known to the respondent seeing the form, or the people
collecting or using the data).
The Newham Household Panel Survey was the only survey to collect data from more
than one household member. It is not known whether the high rates of missing data it
experienced were in any way connected to this, for example if there were concerns
that other household members would be able to see answers.
Further research
The subject of within-household privacy and the concurrent interviewing
environment, and views on interviewer- and self-administered modes will be covered
in ONS’s further research.
Survey response
No researcher gave any indication that survey response rates had been affected by the
inclusion of a question on sexual identity.
The Scottish Census Small Scale Test experimented with using four variants of the
questionnaire, to examine the effect on response rates of asking about sexual
orientation. Each variant employed a different combination of two questions: one on
sexual orientation and one on experience of discrimination on the basis of a number of
grounds, including sexual orientation. The combinations were: both questions (separated by several other questions, with sexual orientation asked second); each question without the other (two variants); and neither question. Unit and item response rates were
very similar across the variants, suggesting there was no effect, either of asking about
sexual orientation per se, or of asking it with and without the context of a question on
the subject of discrimination.
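The four variants amount to a simple two-by-two design; the short Python sketch below merely enumerates the combinations (the factor names are descriptive assumptions, not the labels used on the test):

    from itertools import product

    # Two factors: whether each questionnaire variant carries the sexual orientation
    # question and whether it carries the discrimination question.
    for i, (orientation, discrimination) in enumerate(product([True, False], repeat=2), start=1):
        print(f"Variant {i}: sexual orientation question = {orientation}, "
              f"discrimination question = {discrimination}")

    # Comparing unit and item response rates across the four variants indicates whether
    # asking about sexual orientation, alone or alongside the discrimination question,
    # affects response.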
Item response
The relatively low rates of missing data when the question was interviewer-
administered would seem to indicate that on the surface questions were broadly
acceptable to respondents, in the sense that they did not generally refuse to give a substantive answer.
However the rates of refusals or not answering the question were higher on the self-
completion surveys. This suggested that there might after all be an issue of invasion of
privacy or objection to being asked the question, which was not so apparent from the
interviewer-administered surveys.
Indeed, this was borne out in feedback to the Scottish Census Small Scale Test.
Respondents were asked to indicate how happy they felt about each question. The test
report stated that "The question on sexual orientation stands out with around 11% of
respondents selecting very unhappy or unhappy" compared to less than 5% for most
questions. Respondents could also write comments. For the sexual orientation
question they "ranged from fully accepting to hostile". Concerns expressed included:
"not understanding the need for the data"; "offence" and "unhappiness at the intrusive
nature of the question"; and concerns about confidentiality (e.g. being asked for their
postcode).
Such concerns did not necessarily "translate into refusal to record an answer". As
reported above, the response rates were similar across the versions of the questionnaire with and without the sexual orientation question, suggesting that the question did not
cause people not to respond to the survey. Conversely respondents who "accepted the
question without reservation" included some who answered 'prefer not to say'. It
would therefore appear there was a distinction to be made between objecting to being
asked the question, and objecting to giving an answer, and that the two positions did not
necessarily correspond.
Non-responders to the Scottish Census Small Scale Test question "generally also
declined to record any comments".
The accuracy of substantive data could be compromised if respondents feel pressured into
providing an answer to a question they would rather avoid. If respondents are unaware
of how to avoid answering – the absence of a ‘do not wish to answer’ category and, in
interviewer-administered mode, not knowing or remembering that they can
spontaneously refuse – they might instead give a substantive answer, but one which is
inaccurate.
Higher rates of missing data on the self-completion surveys suggest that some respondents took the opportunity of avoiding giving a substantive answer. Respondents who wished to avoid giving a
substantive answer for one reason or other were more at liberty to do so when self-
administering – when given the chance, they were more likely to take it.
The generally low rates of homosexual and bisexual responses obtained, when compared to the
government 5% to 7% estimate, suggest it is possible that some respondents to
interviewer-administered surveys were giving a socially desirable answer, rather than
being willing to respond with one of the ‘sexual minority’ categories, or saying to the
interviewer ‘I don’t want to answer’ or ‘I don’t know’. The latter two answers might
have been avoided due to respondents feeling the interviewer or someone else might
have made an inference about their sexual identity.
However, a number of factors make it difficult to draw firm conclusions about any
social desirability effect.
Firstly, the benchmark proportion of the population which is LGB is of uncertain reliability.
Thirdly, the rates of missing data on the self-administered surveys were generally
higher than the estimated proportion of the population which is LGB. And LGB
estimates appeared to be more consistent across modes than those for heterosexuals.
So it might be that LGB respondents tended to give correct answers in either mode.
However, it is not known whether a proportion of the LGB population gave incorrect,
socially desirable answers in interviewer-administered mode, or avoided giving
substantive answers in self-completion mode.
Estimates of the heterosexual population tended to be higher when the question was
interviewer-administered, ranging from 88% to 99%, compared with self-
administered questions, where they were between 70% and 95%. Respondents on
interviewer-administered surveys who did not understand the question or response
categories were able to ask the interviewer for help and so give a substantive answer,
which might have reduced the amount of missing data. On self-administered surveys
they might have been less able to check with someone. Another possible reason is that
a proportion of heterosexual respondents avoided giving substantive answers to self-
administered questions, but answered truthfully when ‘forced’ to answer on
interviewer-administered surveys.
As well as the possibility that some respondents gave deliberately untruthful answers
for social desirability reasons, other forms of satisficing might have been at play.
Respondents might have given any answer just to satisfy the interviewer and/or avoid
having to consider a subject which might be embarrassing or sensitive, rather than
deliberately being untruthful. For example, by choosing the first category in the list
presented (the ‘primacy’ effect).
Further research
The questions raised above on missing data, confidentiality, privacy and acceptability
will be covered in the further research. Issues which will be considered include:
• whether respondents should be allowed to avoid giving substantive answers, in
different modes, and, if so, how;
• whether respondents might give socially desirable answers, and what
characteristics they have;
• the optimal balance that can be achieved between minimising missing data and maximising data accuracy.
Interviewer issues

One survey experienced a high rate of refusal to the question. The researchers attributed this to a
“small number” of interviewers coding a refusal for all or the majority of their
interviews, suggesting that some “do not feel able to ask this question…” The
characteristics of these interviewers (e.g. age, sex, experience) were not discovered
and any correlation with high levels of refusals is unknown. It is worth noting that the
interviewers on this survey received written instructions but no briefing,
whereas on other surveys with lower refusal rates, interviewers had been briefed. It
was concluded that “future attempts to ask a question on sexual identity must pay
particular attention to interviewer training and briefing to ensure interviewers feel
able to ask the question.”
Further research
This recommendation and interviewer-related issues generally will be addressed by
ONS as part of its development work.
Further research
ONS will be considering the effect of asking about sexual identity on agreement to
take part in further waves on panel surveys. It will also look at whether the question
need only be asked at the first wave or each wave.
4.11 Issues about asking about sexual identity not addressed by the review
There are some important issues pertinent to the potential introduction of a sexual
identity question on ONS surveys such as the Integrated Household Survey, which
have not been addressed in this review.
None of the surveys on which the sexual identity question was interviewer-
administered interviewed more than one household member, unlike the majority of
ONS’s surveys. Therefore no issues relating to concurrent administration were
covered. Nor were any issues to do with either telephone interviewing or proxy
interviewing, both of which are conducted on some ONS surveys.
Further research
The three topics above will all be addressed in ONS’s development project.
5 References
Wilmot A (2007), ONS, ‘In search of a question on sexual identity’, paper presented
at the 62nd Annual Conference of the American Association for Public Opinion
Research, May 2007.
6 Appendices
Appendix A: Summary table – mode/sample design/sexual identity question designs and estimates
KEY: n/a = not applicable; dk = no data; percentages are weighted and are of all cases; (w) = weighted base; (u) = unweighted base.
Each entry below gives: mode of administration; sample design; question stem and response categories; survey response rate; estimates for the substantive categories (heterosexual/straight, homosexual/gay/lesbian, bisexual, total LGB, other, any additional categories, and total % in substantive categories); estimates for the missing values (‘don’t know’ where presented, ‘do not wish’/‘prefer not to answer’ where presented, refusal/no answer (interviewer code), unanswered on self-completion, and total % missing values); and base.

Northern Ireland Life and Times Survey
Mode: Interviewer-administered (CAPI; showcard). Sample: random probability; general population (NI); 1 person/household (18+).
Question: "Can you tell me which of these best describes you? Please just give me the number on the card." Categories: I am 'gay' or 'lesbian' (homosexual); I am heterosexual or 'straight'; I am bi-sexual; Other; I do not wish to answer this question.
Survey response rate: 61%. Estimates: heterosexual 97.0; gay/lesbian 0.5; bisexual 1.2; total LGB 1.7; other 0.0; total substantive 98.7; do not wish to answer 1.3; total missing 1.3. Base: 1199 (w).

Policing for London Survey
Mode: Interviewer-administered (CAPI; showcard). Sample: random probability; general population (London) plus ethnic boost; 1 person/household (15+).
Question: "Please choose a letter from this card which best describes how you would think of yourself?" Categories: Completely heterosexual; Mainly heterosexual; Bisexual; Mainly gay or lesbian; Completely gay or lesbian.
Survey response rate: 49%. Estimates: completely heterosexual 94.4, mainly heterosexual 2.3; completely gay/lesbian 0.4, mainly gay/lesbian 0.4; bisexual 0.2; total LGB (1.0 or 3.3?); total substantive 97.7; don't know 1.6; refusal/no answer 0.7; total missing 2.3. Base: 2800 (u).

DTI Employees' Awareness, Knowledge and Exercise of Employment Rights Survey
Mode: Interviewer-administered (CAPI; showcard). Sample: random probability; men 16-64 and women 16-59 who were employees now or in the last 2 years; 1 person/household.
Question: "Which of these best describes you? - please just give me the number on the card." Categories: Straight/heterosexual; Gay/lesbian/homosexual; Bisexual; Other.
Survey response rate: 58%. Estimates: heterosexual 98.9; gay/lesbian 0.7; bisexual 0.2; total LGB 0.9; other 0.1; total substantive 99.8; refusal/no answer 0.2; total missing 0.2. Base: 1038 (w).

DTI Fair Treatment At Work Pilot Survey
Mode: Interviewer-administered (CAPI; showcard). Sample: quota sampling (part of an omnibus survey); employees now or in the last 2 years; 1 person/household.
Question: "Please look at this card. Which of the answers on the card best describes you? Please just give me the letter alongside the appropriate answer." Categories: Straight or heterosexual; Gay or lesbian or homosexual; Bisexual; Other.
Survey response rate: n/a. Estimates: heterosexual 87.8; gay/lesbian 2.0; bisexual 0.4; total LGB 2.4; other 0.7; total substantive 90.9; refusal/no answer 9.0; total missing 9.0. Base: 2704 (w).

National Mental Health and Ethnicity Census 2005 Service User Survey
Mode: Interviewer-administered (PAPI; showcard). Sample: simple random sample of patients in each of 41 NHS/private hospitals.
Question: "Which of the categories on this card would you say describes your sexual orientation? Please give me the letter." Categories: Heterosexual (like people of the opposite sex); Gay male or lesbian female (like people of the same sex); Bisexual (like people of both sexes); Transgender.
Survey response rate: 31%. Estimates: heterosexual dk; gay/lesbian dk; bisexual dk; total LGB dk; transgender dk; total substantive dk; don't know dk; refusal/no answer dk; total missing 0. Base: <400 (u).

National Mental Health and Learning Disability Ethnicity Census 2006
Mode: Interviewer-administered (PAPI; showcard). Sample: census of inpatients in all mental health/learning disability establishments, England and Wales.
Question: "Which of the following terms would you use to describe your sexual orientation?" Categories: Lesbian/Gay/Homosexual; Straight/Heterosexual; Bisexual; Other; Do not wish to answer.
Survey response rate: dk. Estimates: heterosexual dk; gay/lesbian dk; bisexual dk; total LGB dk; other dk; total substantive dk; don't know dk; do not wish to answer dk; total missing 0. Base: dk.

British Social Attitudes Survey
Mode: Self-administered (paper) after interviewer-administered survey. Sample: random probability; general population; 1 person/household (16+).
Question: "Which of the following best describes how you think of yourself? PLEASE TICK ONE BOX ONLY." Categories: Heterosexual ('straight'); Gay; Lesbian; Bisexual; Transsexual; Can't choose.
Survey response rate: dk. Estimates: heterosexual 95.3; gay 0.4, lesbian 0.4; bisexual 0.6; total LGB 1.4; can't choose 1.7; transsexual 0.2; total substantive 98.6; unanswered on self-completion 1.4; total missing 1.4. Base: 1732 (u).

Newham Household Panel Survey (see note 4)
Mode: Self-administered (paper) after interviewer-administered survey. Sample: random probability; general population (Newham); all household members (16+).
Question: "Please tell us what best describes you." Categories: I am heterosexual or 'straight'; I am 'gay' or 'lesbian' (homosexual); I am bisexual; If none of the above applies (PLEASE WRITE): I am ______; I do not wish to answer this question.
Survey response rates: Wave 1 74%, Wave 2 82%, Wave 3 78%, Wave 4 44%.
Estimates (Waves 1-4): heterosexual 74.5, 75.2, 70.4, 71.7; gay/lesbian 0.0, 0.7, 0.7, 1.0; bisexual 0.3, 0.4, 0.6, 1.1; total LGB 0.3, 1.1, 1.3, 2.1; none of the above 2.0, 3.3, 3.4, 4.7; total substantive 76.8, 79.5, 75.0, 78.5; do not wish to answer 13.7, 11.7, 15.0, 12.0; unanswered on self-completion 9.5, 8.8, 10.0, 9.5; total missing 23.2, 20.5, 25.0, 21.5. Bases: 1286, 1218, 1132, 636 (u).

Scottish Census Small Scale Test
Mode: Self-administered (postal). Sample: random probability; general population (Scotland); 1 person/household. Half sample asked the sexual orientation question.
Question: "Which of the following best describes your sexual orientation?" Categories: Heterosexual; Lesbian/gay; Bi-sexual; Other; Prefer not to answer.
Survey response rate: 31% (versions with SO question). Estimates: heterosexual 83.1; lesbian/gay 0.9; bisexual 0.4; total LGB 1.3; other 0.9; total substantive 85.3; prefer not to answer 8.5; unanswered 6.2; total missing 14.7. Base: 692 (u).

Civil Service Diversity Survey
Mode: Self-administered (postal). Sample: simple random sample of civil servants from every department, plus boosts.
Question: "If you have no objections to stating your sexuality, please could you state it here?" Categories: Bisexual; Gay man; Heterosexual; Lesbian.
Survey response rate: 48%. Estimates (rounded): heterosexual 89.0; gay 1.0, lesbian 0.0; bisexual 2.0; total LGB 3.0; total substantive 92.0; unanswered 8.0; total missing 8.0. Base: 7863 (u).

Omnibus, July-August 2006 (excluding 15% who skipped)
Mode: Self-administered (CASI) within interviewer-administered survey. Sample: random probability; general population; 1 person/household (16+).
Question: "Which of the following best describes your sexual identity?" Categories: Heterosexual; Lesbian/Gay; Bi-sexual; Other; Prefer not to say.
Estimates: heterosexual 92.0; lesbian/gay 1.3; bisexual 1.2; total LGB 2.5; other 0.9; total substantive 95.4; prefer not to say 4.6; total missing 4.6.

Omnibus, November-December 2006 (excluding 14% who skipped)
Mode and sample: as above.
Question: "Do you consider yourself to be…" Categories: Heterosexual or Straight; Gay or Lesbian; Bisexual; Other; Prefer not to say?
Estimates: heterosexual 96.8; gay/lesbian 0.8; bisexual 0.6; total LGB 1.4; other 0.3; total substantive 98.5; prefer not to say 1.5; total missing 1.5.

Adult Psychiatric Morbidity Survey (still in field)
Mode: Self-administered (CASI) within interviewer-administered survey. Sample: random probability; general population (England); 1 person/household (16+). Split-sample experiment.
Version A question: "Which statement best describes your sexual orientation? This means sexual feelings, whether or not you have had any sexual partners." Categories: Entirely heterosexual (attracted to persons of the opposite sex); Mostly heterosexual, some homosexual feelings; Bisexual (equally attracted to men and women); Mostly homosexual, some heterosexual feelings; Entirely homosexual (attracted to persons of the same sex); Other.
Version B question: "Please choose the answer below that best describes how you think of yourself..." Categories: Completely heterosexual; Mainly heterosexual; Bisexual; Mainly gay or lesbian; Completely gay or lesbian; Other.

Citizenship Survey (still in field)
Mode: Interviewer-administered (CAPI; showcard). Sample: random probability; general population; 1 person/household.
Question: "Looking at this card, which of the options best describes your sexual identity? Please just read out the letter next to the description." Categories: Heterosexual or straight; Gay or lesbian; Bisexual; Other; [or would you] Prefer not to say?

Note 4: Survey response rate dk; the figures shown are the % of completed individual interviews returning the self-completion booklet. Estimates and bases for the sexuality question are for those who returned the self-completion booklet only.
Appendix B: Information about each survey included in the review
drew attention to the question and made some people
feel embarrassed; thus it was dropped for the main stage
(beginning 2000).
• After piloting they discarded various versions of the
question which did not give the options in both formal
and informal language. Interviewers felt that some
people weren't quite sure what heterosexual meant, and
confused it with homosexual. The options now used
include 'I am heterosexual or straight'.
Feedback from researchers, interviewers, respondents:
• The researchers had no feedback from the fieldwork agency on acceptability and
comprehension of the question by respondents.
• There is no known explanation for there being no 'other' responses given in 2005.
• The phrase 'I do not wish to answer this question' was thought to be clearer for
respondents than 'Refused', having less negative nuances.
• Interviewers thought that it was less embarrassing for respondents if they were
given a show card than if the responses were read out by the interviewer. Also,
respondents give only the associated numbers, rather than the phrase on the card,
which again cuts down potential embarrassment.
Links http://www.ark.ac.uk/nilt/
http://www.esds.ac.uk/government/nilts/
1.2 Policing For London Survey
Base =8293 =2800
Non-response / non-substantive responses: No 'prefer not to say' category was
offered. Interviewers could code ‘don’t know’ and ‘not answered’.
Pre-testing, piloting, changes made over time: -
Feedback from researchers, interviewers, respondents:
• The technical report didn't indicate any particular issues about the question on
'sexual orientation'.
• Mike Hough: “by any reasonable standards the results are pretty hopeless… baldly
unsatisfactory.”
• One interviewer, contacted by NatCen for this ONS review, commented that there
were problems with comprehension, including among people with limited English,
but that there were no problems with privacy, confidentiality or acceptability:
"Respondents often misunderstood the question. Many laughed and replied 'None
of these!', or in several cases 'None of those - just 'Normal'!'. Others chose an
answer, but when probed, it became apparent that they had misunderstood the
meaning of the terms, and had therefore given an answer which did not describe
them accurately.... In London, many respondents have a limited working knowledge
of the English language, which does not extend to polysyllabic technical terms of
Greek and Latin origin. The only way to obtain reliable data from them is to use
simpler words."
Links Technical report: www.blink.org.uk/docs/policereport.pdf
1.3 Employees’ Awareness Knowledge and Exercise of Employment Rights Survey
Sponsor DTI
Information provided by DTI
Year conducted 2005
Survey mode CAPI
Sexual identity question mode: CAPI (show card).
Sample design Random probability (PAF). Population: individuals of
working age (16-64 for men and 16-59 for women) who
were employees or had been employees in the previous two
years. One person selected per household.
Survey topics Various issues relating to employment rights
Survey response information 58%
Sexual identity question purpose: A 'socio-demographic' sexual identity question was
included in order to examine the experiences of LGB employees compared to
heterosexual / straight employees, in respect of employment rights and fair treatment.
Location in questionnaire/interview: Last substantive question (followed by a recontact
question).
Other SID-related questions Relating to awareness of employer’s obligation re. unfair
treatment and whether employee had been unfairly treated
Sexual Identity question "Which of these best describes you? - please just give me
the number on the card." (Showcard categories numbered 1
to 4).
1. Straight/heterosexual
2. Gay/lesbian/homosexual
3. Bisexual
4. Other
Explanations/definitions/assurances provided to respondent: Prompt for interviewer
use if respondents objected to or queried the question:
"We’re collecting this information to find out more about the discrimination people
may face for different reasons. The information is kept completely anonymous."
Interviewer instructions Briefed (by video) and received written instructions.
Estimates (weighted % unless otherwise indicated):
                           All     Valid
Straight/heterosexual      98.9    99.1
Gay/lesbian/homosexual     0.7     0.7
Bisexual                   0.2     ?
Other                      0.1     ?
Refused                    0.2     -
Base                       =1038
Non-response / non-substantive responses: A 'refusal' category was available to
interviewers. It was not shown on the card in order to maximise response to the
question.
Pre-testing, piloting, changes made over time: -
Feedback from researchers, interviewers, respondents:
• DTI researchers hypothesised that the clustered design might have led to
under-representation of LGB people, if they are concentrated in certain urban areas.
• For each interview the interviewer was asked "Was there any reluctance or
objection from the respondent in answering this question?", to which the responses
were Yes: 2%; No: 98%. The fieldwork contractor reported after piloting that
"everyone was happy to answer the sexual orientation question - the show card
helped lessen any potential embarrassment for respondents and interviewers asking
the question".
Links
1.4 Fair Treatment at Work Pilot Survey
Sponsor DTI
Information provided by DTI
Year conducted 2005
Survey mode CAPI
Sexual identity question mode of administration: CAPI (concealed response show
card).
Sample design Quota sampling. Administered as part of an omnibus survey.
Population: adults in work as employees or who had worked
as employees in the previous two years. One person was
selected per household.
Survey response information Response rate not applicable (quota sample). 2366
interviews achieved in main survey plus 1570 from boosts.
Survey topics Perception of unfair treatment in workplace; reporting of
personal and observed unfair treatment
Sexual identity question purpose: A 'socio-demographic' sexual identity question was
included in order to examine the experiences of LGB employees compared to
heterosexual / straight employees, in respect of employment rights and fair treatment.
Location in questionnaire/interview: Last substantive question (followed by a recontact
question).
Other SID-related questions Awareness of/personal experience of unfair treatment due to
sexual orientation.
Sexual Identity question "Please look at this card. Which of the answers on the card
best describes you? Please just give me the letter
alongside the appropriate answer." (Show card categories
labelled R, D, P, H).
R: Straight or heterosexual
D: Gay or lesbian or homosexual
P: Bisexual
H: Other
Explanations/definitions/assurances provided to respondent: Prompt for interviewers
to use if respondents objected to or queried the question:
"We’re collecting this information to find out more about the discrimination people
may face for different reasons. The information is kept completely anonymous."
Interviewer instructions Interviewers received written instructions. No briefing.
Estimates (weighted % unless otherwise indicated):
                           All     Valid
Straight/heterosexual      87.8    96.5
Gay/lesbian/homosexual     2.0     2.2
Bisexual                   0.4     ?
Other                      0.7     0.7
Refused                    9.0     -
Base                       =2704
Non-response / non-substantive responses: A 'refusal' category was available to
interviewers. It was not shown on the card in order to maximise response to the
question.
Pre-testing, piloting, changes made over time: On the pilot, self-completion was also
tried, using a CAPI pen, which was reported to have been "slightly preferred" by
respondents as "more anonymous". But it broke up the interview and was a struggle
for certain respondents. The show card version was "not met with any resistance",
though it was said that interviewers knew the answer so it was not confidential.
However, it appears that all pilot respondents were straight - different views might be
expressed by LGB respondents.
Feedback from researchers, interviewers, respondents:
• The Fair Treatment at Work Survey asked "Was there any reluctance or hesitation
at all when answering the earlier question about sexual orientation?", to which the
responses were Yes: 10%; No: 89%; Don't know: 1%.
• The "considerably higher levels" of refusal to answer, and of interviewer
perceptions of reluctance or objection, have been ascribed by DTI to "a small
number of interviewers" coding a refusal for the majority or all of their interviews.
The refusals did not occur at random, suggesting "that some interviewers do not
feel able to ask this question and record a refusal, instead of asking the question
and recording the correct response." It is worth noting that those interviewers
received no briefing. DTI conclude that "Future attempts to ask a question on
sexual identity must pay particular attention to interviewer training and briefing to
ensure interviewers feel able to ask the question."
Links
1.5 Count Me In: The National Mental Health and Ethnicity Census 2005 Service User Survey
See also 1.6 below regarding experiences of the 2006 NHME Census itself.
1.6 Count Me In: The National Mental Health and Learning Disability Ethnicity Census 2006
• Patients did not have to reply.
• It explained difference between orientation and activity.
Estimates (weighted % unless otherwise indicated): Full results were not published.
2.1% of all records were coded LG or B. But many providers (nearly 50%) did not
return any records coded LGB – they might not have asked the question. Among the
providers who returned any LGB-coded cases, the rate was 3.1%; rates ranged from
under 1% to over 10%. The highest proportions were in private and voluntary
establishments rather than the NHS (which were felt to have a better environment).
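As a rough arithmetic check (not a calculation made in the report), the overall and conditional rates imply that the providers returning no LGB-coded records accounted for roughly a third of all records, since the overall rate is simply the conditional rate diluted by those providers' share. A minimal sketch, assuming the two rates apply to shares of records and using illustrative variable names:

    # Rough check (illustrative; not from the report): the overall LGB rate equals the
    # rate among providers returning any LGB-coded records, diluted by the share of
    # records held by providers that returned none.
    overall_rate = 0.021       # 2.1% of all records coded LGB
    conditional_rate = 0.031   # 3.1% among providers returning any LGB-coded records

    implied_share = 1 - overall_rate / conditional_rate
    print(f"{implied_share:.0%}")  # ~32% of records implied to come from providers returning no LGB codes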
Non-response / non-substantive responses: There was no category for 'not asked'
(there will be from 2007) but a value had to be entered somewhere, so 'other' might
have included the cases where the question wasn't asked, as well as genuine
others/don't knows.
Pre-testing, piloting, changes made over time: -
Feedback from researchers, interviewers, respondents:
• No information on feedback from patients.
• Problems were encountered with many providers administering the Census not
asking the question. Providers were given a feedback questionnaire; only a third
responded to a question on the percentage of patients who had been asked the SO
question, of which 63% said they had asked 75% or more of patients and 14% had
asked fewer than 25%.
• MHAC conducted road shows with providers to discuss issues about conducting
the Census. There were some “shocking” findings, including views that the question
did not need to be asked because you could tell just by looking if someone was gay
(there is no information on how often it was answered by providers rather than
asked of patients), and that staff could not be expected to ask because it was too
embarrassing. But there was also positive feedback from the establishments which
had administered the question to more people.
Links www.mhac.org.uk/census
2 SELF-ADMINISTERED QUESTIONS IN INTERVIEWER-ADMINISTERED SURVEYS
2.2 Newham Household Panel Survey
Wave 4: 21.5
Pre-testing, piloting, changes made over time: A preamble was formerly used, but
dropped from Wave 4. It read "Some people have a partner of the same sex as
themselves (homosexual) while others have a partner of the opposite sex
(heterosexual)."
Feedback from researchers, interviewers, respondents:
• 'None of the above' – verbatim responses not available.
• None from interviewer debriefing or from respondents. No apparent problems.
• In all waves 'white' respondents were significantly more likely to answer the SID
question (1 in 7 did not answer), while 'Asian' respondents (3 in 10 did not answer)
and 'other' ethnic groups (base too small) were significantly less likely.
• No analysis of individual responses from wave to wave was available.
Links http://www.newham.info/research/NHPS.htm
Survey Estimates
Wave 1                                  % of all individual   % of all cases where   % of valid
                                        interviews            s/c returned           cases
Heterosexual or straight                55.3                  74.5                   97.0
Gay or lesbian (homosexual)             0.0                   0.0                    0.0
Bisexual                                0.2                   0.3                    0.4
None of the above                       1.5                   2.0                    2.6
Do not wish to answer this question     10.1                  13.7                   -
Total answering question                N=988
Not answered on s/c                     7.0                   9.5                    -
Total returning s/c form                74.2                  N=1286
Self completion not returned            25.8
Total individual interviews             N=1733
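The three percentage columns in the Wave 1 and Wave 4 tables differ only in their denominator: the same category counts are divided by all individual interviews, by returned self-completion forms, or by cases giving a substantive answer. A minimal sketch of that arithmetic, using the published Wave 1 bases and a count back-calculated from the published percentages (the variable names are ours, not the survey's):

    # Illustrative sketch: relating the three percentage bases in the Wave 1 table.
    interviews = 1733    # all individual interviews
    returned_sc = 1286   # self-completion forms returned
    valid = 988          # cases giving a substantive answer (incl. 'none of the above')

    def pct(n, base):
        """Percentage of a count over a given base, to one decimal place."""
        return round(100 * n / base, 1)

    heterosexual = round(0.745 * returned_sc)  # ~958, back-calculated from 74.5%
    print(pct(heterosexual, interviews))       # ~55.3 (% of all individual interviews)
    print(pct(heterosexual, returned_sc))      # ~74.5 (% of all cases where s/c returned)
    print(pct(heterosexual, valid))            # ~97.0 (% of valid cases)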
Wave 4                                  % of all individual   % of all cases where   % of valid
                                        interviews            s/c returned           cases
Heterosexual or straight                31.4                  71.7                   91.2
Gay or lesbian (homosexual)             0.4                   1.0                    1.2
Bisexual                                0.5                   1.1                    1.4
None of the above                       2.1                   4.7                    6.0
Do not wish to answer this question     5.3                   12.0                   -
Total answering question                N=500
3 SELF-ADMINISTERED - POSTAL SURVEYS
Response information 31% (biased towards respondents aged over 30, females,
and those in less deprived areas, suggesting response
cannot be generalised to the whole population). Response
rates to the variants were very similar.
Estimates (weighted % unless otherwise indicated):
Unweighted data %        Total    Both Qs   No discrimination Q
Heterosexual             83.1     82.7      83.5
Lesbian/gay              0.9      1.4       0.3
Bi-sexual                0.4      0         0.9
Other                    0.9      0.6       1.2
Prefer not to answer     8.5      9.2       7.8
Non response             6.2      6.1       6.4
Base                     692      346       346
Non-response / non-substantive responses:
• 'Prefer not to answer' and item non-response far outweighed response to the LGB
categories.
• Item non-response is higher than for religion and other questions (but lower than
for caring, the next question on the form - a possible order effect?).
Pre-testing, piloting, changes made over time: -
Feedback from researchers, interviewers, respondents:
• A 'specify' box for 'other' was not included, being considered relevant only to a
small number of respondents and a potential ‘magnet’ for facetious responses.
• The conclusion of the author was that the "accuracy" and “utility" of the data are
questionable.
• The researchers conclude that the "absence of a
difference in response rates between the variants ...
may indicate that the most compliant sections of the
population ...are not put off by the inclusion of a sexual
orientation question."
• At the end of the form respondents were asked to
indicate how happy they felt about each question. "The
question on sexual orientation stands out with around
11% of respondents selecting very unhappy or
unhappy" compared to less than 5% for most
questions.
• Respondents could also write comments. For the
sexual orientation question they "ranged from fully
accepting to hostile".
• People who "accepted the question without
reservation" included people answering 'prefer not to
say'.
• Concerns expressed included:
o "not understanding the need for the data";
o "offence" and "unhappiness at the intrusive
nature of the question"; and concerns about
confidentiality (e.g. being asked for their
postcode). They did not necessarily "translate
into refusal to record an answer" [to the
question].
• Some respondents' comments "suggested they had
possibly misunderstood or misinterpreted the question".
These included an elderly woman who answered the
question with a sexual orientation but did not think the
question applied to her, and another who had to think
what the question meant but answered as best she
could.
• Non-responders to the question "generally also
declined to record any comments".
Links Report: http://www.gro-
scotland.gov.uk/census/censushm2011/question-
development/sexual-orientation-in-the-census.html
3.2 Civil Service Diversity Survey
4 SURVEYS IN FIELD DURING TIME OF REVIEW – NO ESTIMATES AND LIMITED FEEDBACK AVAILABLE
Sexual identity question purpose; definition of SexID/orientation: Designed to compare
two versions of the sexual identity question (see below). Version assigned at random
during course of interview (CASI controlled).
VERSION A
A1. Sexori
Which statement best describes your sexual orientation?
This means sexual feelings, whether or not you have had any
sexual partners.
Other
A2. Sexpart
Have your sexual partners been... ("Sexual partners"):
only opposite sex,
mainly opposite sex but some same sex partners,
mainly same sex but some opposite sex partners,
only same sex,
or, I have not had a sexual partner
VERSION B
B1. Sexdes
Please choose the answer below that best describes how you
think of yourself...:
completely heterosexual
mainly heterosexual
bisexual
mainly gay or lesbian
completely gay or lesbian
Other
B2. SexPart2
Sexual experience is any kind of contact with another person
that you felt was sexual (it could be just kissing or touching, or
intercourse, or any other form of sex).
Has your sexual experience been...:
Only with women/men (or a woman/man), never with a man/woman,
More often with women/men, and at least once with a man/woman,
About equally often with women/men and men/women,
More often with men/women, and at least once with a woman/man,
Only with men/women (or a man/woman), never with a woman/man,
I have never had any sexual experience with anyone at all
4.2 Citizenship Survey
Sexual identity question purpose; definition of SexID/orientation: Requested by
Women and Equality Unit and others. Will provide an indicator of the respondent's
self-reported sexual orientation/identity.
Location in questionnaire/interview: In the middle of the second of two demographics
sections, towards the end of the questionnaire. Preceded by: media consumption
(newspapers, radio, TV, internet); vehicle ownership; health; and caring. Followed by:
employment status; main job; qualifications; and income (banded).
The first demographics section, at the start of the questionnaire, covers: sex; marital
status; tenure; ethnicity; nationality; whether employed.
Other SexID-related questions: A question on personal identities: "Suppose you were
describing yourself, which of the things on this card would say something important
about you?" The 16 response categories include work, family, income and several
identities including ethnicity, religion, social class, gender and "your sexuality". The
question is multicoded, followed by one asking which is "the most important thing".
Sexual Identity question (Asked if interview is not being translated by family member
or friend)
W …Heterosexual or straight
P … Gay or lesbian
H … Bisexual
S …Other
G … [or would you] Prefer not to say?
Ask if =Other
Explanations/definitions/assurances provided to respondent: Preamble:
The next question is about sexual identity. We are asking this question because the
government department funding this study, Communities and Local Government, is
responsible for helping to reduce all forms of prejudice and discrimination in society.
Interviewer instructions/briefing
Estimates (weighted % unless otherwise indicated): Not yet available
Non-response / non-substantive responses: Not yet available
Pre-testing, piloting, changes made over time: Cognitive testing conducted by NatCen,
with regard to respondent reactions, objections and comprehension.
Question tested:
Please read out the letter which best describes your sexual
identity?
C. …Heterosexual or straight
B. ... Gay or lesbian
D. ... Bisexual
A. ...Other
E ...[or would you] Prefer not to say?