As one who has encouraged the city to get behind the use of social engagement and other such tools to facilitate public dialogue, I am somewhat sympathetic to the city here. On the other hand, pumping up survey results to bolster public policy in a town like Davis is fraught with risk.
The Vanguard has long supported a containerization of green waste to avoid the unsightly, messy, and potentially hazardous dumping of waste onto the sides of streets where it gets swept into storm drains and presents added hazards to bikes.
Today, City Manager Dirk Brazil has an op-ed presenting the city's side, in which he explains the programmatic changes that may occur. These include an organic cart picked up with trash and recycling once a week, and yard waste piles collected monthly, except October through mid-December, when they'll be collected weekly.
All of this is fine. However, he then writes: “The city used Davis Together::Engage, a survey tool, to gauge customer feelings and solicit feedback on the proposed program. Of the 341 residents who took the survey, 58 percent of customers indicated that the level of service in the proposed program would meet their needs and 82% stated that they thought collecting and composting organics would be beneficial to the City.”
The city’s use of “Davis Together::Engage,” however, prompted distinguished professor Arthur Shapiro to respond. Perhaps, he allows, it is true that most residents are all right with the yard waste bins, but based on what we know, we have “no reason to believe that.”
He argues, “To assess whether a sample was adequate and to estimate a survey’s margin of error, one needs to know the survey’s design. But this ‘survey’ isn’t really a survey in any valid statistical sense, and it has no apparent design.”
The problem starts here: “How can we tell whether the 335 respondents adequately represent community opinion? They are self-selected, which is to say likely to be highly motivated on the issue.”
He continues, “What percentage of Davis households knew that a survey was being conducted? For that matter, what percentage of Davis households have ever heard of the ‘Davis Together :: Engage social media tool,’ let alone have accessed it? And how did they learn about it?”
Professor Shapiro concludes, “Anyone with any experience in survey design and implementation can immediately recognize the ‘results’ reported in your story as worthless. Surely America’s most-educated city can do better than this.”
Another writer, Neil Rubenking, makes the same point: “If you have a random sample of 300-odd from a population of 50,000 (Davis residents over 18) then you can draw conclusions with a reasonable expectation of accuracy.”
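Mr. Rubenking's point can be illustrated with a back-of-the-envelope calculation. The sketch below, which assumes a truly random sample (the very assumption the critics say fails here) and the article's approximate figures of 341 respondents out of roughly 50,000 adult residents, computes the standard 95 percent margin of error with a finite-population correction:

```python
import math

def margin_of_error(n, population, p=0.5, z=1.96):
    """Approximate 95% margin of error for a simple random sample,
    using the worst case p = 0.5 and a finite-population correction."""
    moe = z * math.sqrt(p * (1 - p) / n)                   # basic formula
    fpc = math.sqrt((population - n) / (population - 1))   # correction for small populations
    return moe * fpc

# Figures drawn from the article: ~341 respondents, ~50,000 adults.
print(round(margin_of_error(341, 50_000) * 100, 1))  # about 5.3 percentage points
```

In other words, a genuinely random sample of this size would pin down community opinion to within roughly plus or minus five points; but that guarantee evaporates entirely when respondents select themselves.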
He continues, “But this survey is absolutely the opposite of random. Any residents who don’t choose to go online were excluded. Any who found the ‘Together::Engage’ page to be impossibly rah-rah were excluded. Any who spend all their time online but not on local sites were excluded. The sample responders were totally self-selected.”
He concludes, “All conclusions that imagine this survey to represent Davis residents are patently flawed. If this is in any way unclear, please find a stats prof at UCD and get confirmation. I would hope for a retraction of all statements that imply the survey represents what Davisites think. (For what it’s worth, I did complete the survey).”
While the critics here are of course correct – we need to be more careful with how we present and use non-random surveys of public opinion, especially when we use them to support public policy – I would stop short of stating that this is meaningless.
Instead, I would treat this more like public comment. What does public comment tell us? Well, first of all, it tells us who the motivated people are, the ones who will speak out on a given issue.
Second, public commenters may be able to identify issues and problems that staff and council had not considered.
What public comment does not tell us is the direction or distribution of opinion within the broader community. It is not a poll or an election. And, therefore, it should not be used as such.
One of the critical questions we have to ask about the survey tool is whether it is possible for someone to vote twice. I remember that some time ago the Enterprise had reader polls. That worked until they ran a very controversial question and people figured out how to get around the site’s cookies and manipulate the poll.
My takeaway from this is that using these engagement tools as a guide is fine. But I would suggest – particularly in a highly educated community with people well versed in survey research methodology – that the city avoid presenting the results as though they were poll results.
In his op-ed, Mr. Brazil addresses a few other points. A number of people have expressed concerns that they do not have room for a third cart. He writes, “In addition to the standard 95-gallon carts, Davis Waste Removal will provide 35 and 65-gallon carts to customers upon request. Using the recycling and organics carts will greatly reduce what is placed into the garbage cart. This will allow some customers to reduce the size of their garbage cart which will free up space for the organics cart.”
As we have noted, many other communities have three carts and have somehow made it work – many of them have had such programs in place for a decade or two.
Mr. Brazil also indicates that “the City is also looking into ‘opt-out,’ ‘share-a-can’ and variable rate charges for the organics program.”
—David M. Greenwald reporting