1.
Abrajano and Lajevardi 2021 "(Mis)Informed: What Americans Know About Social Groups and Why it Matters for Politics" reported (p. 34) that:
We find that White Americans, men, the racially resentful, Republicans, and those who turn to Fox and Breitbart for news strongly predict misinformation about [socially marginalized] social groups.
But their research design is biased toward many or all of these results, given how they selected the items for their 14-item misinformation battery. Below, I'll focus on left/right political bias and then discuss apparent errors in the publication.
---
2.
Item #7 is a true/false item:
Most terrorist incidents on US soil have been conducted by Muslims.
This item will code as misinformed some participants who overestimate the percentage of U.S.-based terror attacks committed by Muslims, but won't code as misinformed any participants who underestimate that percentage.
It seems reasonable to me that persons on the political Left are more likely than persons on the Right to underestimate the percentage of U.S.-based terror attacks committed by Muslims, and that persons on the Right are more likely than persons on the Left to overestimate that percentage, so I'll code this item as favoring the political Left.
---
Four items (#11 to #14) ask about Black/White differences in receipt of federal assistance, phrased in terms of which group constitutes the "primary recipients" of food stamps, welfare, social security, and public housing.
But none of these items measured misinformation about receipt of federal assistance as a percentage. So participants who report that the *number* of Blacks who receive food stamps is higher than the number of Whites who receive food stamps get coded as misinformed. But participants who mistakenly think that the *percentage* of Whites who receive food stamps is higher than the percentage of Blacks who receive food stamps do not get coded as misinformed.
Table 2 of this U.S. government report indicates that, in 2018, non-Hispanic Whites were 67% of households, 45% of households receiving SNAP (food stamps), and 70% of households not receiving SNAP. Respective percentages for Blacks were 12%, 27%, and 11% and for Hispanics were 13.5%, 22%, and 12%. So, based on this, it's correct that Whites are the largest racial/ethnic group that receives food stamps on a total population basis...but it's also true that Whites are the largest racial/ethnic group that does NOT receive food stamps on a total population basis.
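The point that the same group can be the plurality among both recipients and non-recipients can be illustrated with the household shares quoted above. This is a minimal sketch using the 2018 figures from the cited report; the "other" category is my own filler so each distribution sums to roughly 100%, not a number from the report.

```python
# Approximate shares of U.S. households by race/ethnicity in 2018,
# taken from the government report quoted above; "other" is a filler
# category of my own for illustration.
snap_share = {"white": 45.0, "black": 27.0, "hispanic": 22.0, "other": 6.0}
non_snap_share = {"white": 70.0, "black": 11.0, "hispanic": 12.0, "other": 7.0}

# The largest group among SNAP recipients...
largest_recipient = max(snap_share, key=snap_share.get)
# ...is also the largest group among non-recipients.
largest_nonrecipient = max(non_snap_share, key=non_snap_share.get)

print(largest_recipient, largest_nonrecipient)  # white white
```

So "Whites are the primary recipients" and "Whites are the primary non-recipients" are both true on a total-population basis, which is why the percentage framing matters.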
It seems reasonable to me that the omission of percentage versions of these public assistance items favors the political Left: persons on the Left are more likely than persons on the Right (or, for that matter, Independents and moderates) to rate Blacks higher than Whites, so persons on the Left would presumably be more likely than persons on the Right to prefer, and thus guess, that Whites and not Blacks are the primary recipients of federal assistance. So, by my count, that's at least four items that favor the political Left.
---
As far as I can tell, Abrajano and Lajevardi 2021 didn't provide citations to justify their coding of correct responses. But it seems to me that such citations should be a basic requirement for research that codes responses as correct, except for obvious items such as, say, who the current Vice President is. A potential problem with this lack of citations is that it's not clear to me that some responses Abrajano and Lajevardi 2021 coded as correct are truly correct, or at least are the only responses that should be coded as correct.
Abrajano and Lajevardi 2021 coded "Whites" as the only correct response for the "primary recipients" item about welfare, but this government document indicates that, for 2018, the distribution of TANF recipients was 37.8% Hispanic, 28.9% Black, 27.2% White, 2.1% multi-racial, 1.9% Asian, 1.5% AIAN, and 0.6% NHOPI.
And "about the same" is coded as the only correct response for the item about the "primary recipients" of public housing (item #14), but Table 14 of this CRS Report indicates that, in 2017, 33% of public housing had a non-Hispanic White head of household and 43% had a non-Hispanic Black head of household. This webpage permits searching for "public housing" for different years (screenshot below), which, for 2016, indicates percentages of 45% for non-Hispanic Blacks and 29% for non-Hispanic Whites.
Moreover, it seems suboptimal to have the imprecise "about the same" response be the only correct response. Unless outcomes for Blacks and Whites are exactly the same, presumably selection of one or the other group should count as the correct response.
---
Does a political bias in the Abrajano and Lajevardi 2021 research design matter? I think the misinformation rates are close enough that it does: Figure A2 indicates that the Republican/Democrat misinformation gap is less than a point, with misinformed means of 6.51 for Republicans and 5.83 for Democrats.
Ironically, Abrajano and Lajevardi 2021 Table A1 indicates that their sample was 52% Democrat and 21% Republican, so -- on the "total" basis that Abrajano and Lajevardi 2021 used for the federal assistance items -- Democrats were the "primary" partisan source of misinformation about socially marginalized groups.
---
NOTES
1. Abrajano and Lajevardi 2021 (pp. 24-25) refers to a figure that isn't in the main text, and I'm not sure where it is:
When we compare the misinformation rates across the five social groups, a number of notable patterns emerge (see Figure 2)...At the same time, we recognize that the magnitude of difference between White and Asian American's [sic] average level of misinformation (3.4) is not considerably larger than it is for Blacks (3.2), nor for Muslim American respondents, who report the lowest levels of misinformation.
Table A5 in the appendix indicates that Blacks had a lower misinformation mean than Muslims did, 5.583 compared to 5.914, so I'm not sure what the aforementioned passage refers to. The passage phrasing refers to a "magnitude of difference", but 3.4 doesn't seem to refer to a social group gap or to an absolute score for any of the social groups.
2. Abrajano and Lajevardi 2021 footnote 13 is:
Recall that question #11 is actually four separate questions, which brings us to a total of thirteen questions that comprise this aggregate measure of political misinformation.
Question #11 being four separate questions means that there are fourteen questions in total, and Abrajano and Lajevardi 2021 refers to "fourteen" questions elsewhere (pp. 6, 17).
Abrajano and Lajevardi 2021 indicated that "...we also observe about 11% of individuals who provided inaccurate answers to all or nearly all of the information questions" (p. 24, emphasis in the original), and it seems a bit misleading to italicize "all" if no one provided inaccurate responses to all 14 items.
3. Below, I'll discuss the full set of 14 "misinformation" items. Feel free to disagree with my count, but I would be interested in an argument that the 14 items do not on net bias results toward the Abrajano and Lajevardi 2021 claim that Republicans are more misinformed than Democrats about socially marginalized groups.
For the aforementioned items, I'm coding items #7 (Muslim terror %), #11 (food stamps), #12 (welfare), and #14 (public housing) as biased in favor of the political Left, because I think that these items are phrased so that the items will catch more misinformation among the political Right than among the political Left, even though the items could be phrased to catch more misinformation among the Left than among the Right.
I'm not sure about the social security item (#13), so I won't code that item as politically biased. So by my count that's 4 in favor of the Left, plus 1 neutral.
Item #5 seems to be a good item, measuring whether participants know that Blacks and Latinos are more likely to live in regions with environmental problems. But it's worth noting that this item is phrased in terms of rates and not, as for the federal assistance items, as the total number of persons by racial/ethnic group. So by my count that's 4 in favor of the Left, plus 2 neutral.
Item #1 is about the number of undocumented immigrants in the United States. I won't code that item as politically biased. So by my count that's 4 in favor of the Left, plus 3 neutral.
The correct response for item #2 is that most immigrants in the United States are here legally. I'll code this item as favoring the political Left for the same reason as the Muslim terror % item: the item catches participants who overestimate the percentage of immigrants here illegally, but the item doesn't catch participants who underestimate that percentage, and I think these errors are more likely on the Right and Left, respectively. So by my count that's 5 in favor of the Left, plus 3 neutral.
Item #6 is about whether *all* (my emphasis) U.S. universities are legally permitted to consider race in admissions. It's not clear to me why it's more important that this item be about *all* U.S. universities instead of about *some* or *most* U.S. universities. I think that it's reasonable to suspect that persons on the political Right will overestimate the prevalence of affirmative action and that persons on the political Left will underestimate the prevalence of affirmative action, so by my count that's 6 in favor of the Left, plus 3 neutral.
I'm not sure that items #9 and #10 have much of a bias (number of Muslims in the United States, and the country that has the largest number of Muslims), other than to potentially favor Muslims, given that the items measure knowledge of neutral facts about Muslims. So by my count that's 6 in favor of the Left, plus 5 neutral.
I'm not sure what "social group" item #8, about whether Barack Obama was born in the United States, is supposed to measure. I'm guessing that a good percentage of "misinformed" responses for this item are insincere. Even if it were a good idea to measure insincere responses to test a hypothesis about misinformation, I'm not sure why it would be a good idea to omit a corresponding item about a false claim that, like the Obama item, is known to be more likely to be accepted among the political Left, such as items about race and killings by police. So I'll up the count to 7 in favor of the Left, plus 5 neutral.
Item #4 might reasonably be described as favoring the political Right, in the sense that I think that persons on the Right would be more likely to prefer that Whites have a lower imprisonment rate than Blacks and Hispanics. But the item has this unusual element of precision ("six times", "more than twice") that isn't present in items about hazardous waste and about federal assistance, so that, even if persons on the Right stereotypically guess correctly that Blacks and Hispanics have higher imprisonment rates than Whites, these persons still might not be sure that the "six times" and "more than twice" are correct.
So even though I think that this item (#4) can reasonably be described as favoring the political Right, I'm not sure that it's as easy for the Right to use political preferences to correctly guess this item as it is for the Left to use political preferences to correctly guess the hazardous waste item and the federal assistance items. But I'll count this item as favoring the Right, so by my count that's 7 in favor of the Left, 1 in favor of the Right, plus 5 neutral.
Item #3 is about whether the U.S. Census Bureau projects ethnic and racial minorities to be a majority in the United States by 2042. I think that it's reasonable that a higher percentage of persons on the political Left than the political Right would prefer this projection to be true, but maybe fear that the projection is true might bias this item in favor of the Right. So let's be conservative and count this item as favoring the Right, so that my coding of the overall distribution for the 14 misinformation items is: seven items favoring the Left, two items favoring the Right, and five politically neutral items.
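The running tally above can be summarized in one place. This is a minimal sketch of my own per-item coding as argued in this note, not anything from the paper itself:

```python
from collections import Counter

# My per-item bias coding of the 14 Abrajano and Lajevardi 2021 items,
# as argued above (this coding is mine, not the authors').
item_coding = {
    1: "neutral",   # number of undocumented immigrants
    2: "left",      # most immigrants are here legally
    3: "right",     # Census 2042 majority-minority projection
    4: "right",     # Black/Hispanic imprisonment rates
    5: "neutral",   # environmental hazards by race/ethnicity
    6: "left",      # *all* universities and race in admissions
    7: "left",      # Muslim terror percentage
    8: "left",      # Obama born in the United States
    9: "neutral",   # number of Muslims in the United States
    10: "neutral",  # country with the most Muslims
    11: "left",     # food stamps
    12: "left",     # welfare
    13: "neutral",  # social security
    14: "left",     # public housing
}

tally = Counter(item_coding.values())
print(tally)  # Counter({'left': 7, 'neutral': 5, 'right': 2})
```

That is, seven items favoring the Left, two favoring the Right, and five neutral, matching the distribution stated above.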
4. The ANES 2020 Time Series Study has similar biases in its set of misinformation items.