Re: [119all] Result of the IETF 119 Brisbane post-meeting survey

--On Monday, April 15, 2024 18:07 -0400 Christian Hopps
<chopps@xxxxxxxxxx> wrote:

> 
> John Scudder <jgs@xxxxxxxxxxx> writes:
> 
>> Hi Chris,
>> 
>> On Apr 14, 2024, at 7:23 AM, Christian Hopps <chopps@xxxxxxxxxx>
>> wrote:
>>> forced/unwanted remote participation
>> 
>> It occurs to me I don't know what you mean by this phrase.
>> Obviously, nobody is literally being forced to participate
>> remotely. I guess you're probably talking about the responses to
>> Q5 other than "it is my preferred way to participate"?
> 
> Yes, from the survey:
> 
> "For the third time, we asked people who participated remotely
> (Q5), why they did and if they would have preferred to participate
> onsite (Q5a). Once again, the major factor, cited by 75% of people,
> was the lack funding to travel."

Chris, John,

Sequences like the above are why I keep arguing that one has to be
very careful about how survey answers are interpreted and how much is
read into them, and often about exactly how questions are asked and
what alternatives are given.  The latter comes with the understanding
that it is sometimes not possible to do better.

For this case, and using myself as an example... While I'm an oddity,
not least because I have not had organizational funding for IETF
participation for years, I've participated remotely and have answered
that question with "yes, I would have preferred to be onsite" and
"lack of funding".  Same answer for San Francisco, Prague, and
Brisbane; little or no information in the survey answer about the
relative differences in costs to me or what secondary factors might
have contributed to my deciding to not go and/or not ask for a fee
waiver (a small fraction of the cost anyway).  And, because some of
those factors include what I consider sensitive information, if there
had been detailed questions about them, I probably would not have
answered.  

Maybe I'm unique in those ways, but I'd be really surprised.

So, yes, the question is good to ask and the data are useful.  And
changes in the ratios of onsite to remote attendees or the percentage
who respond "yes" are at least interesting and may give an indication
of something to watch and think about.  But drawing strong inferences
from them -- especially when there is reason to suspect that the
collection of people who respond to the survey is not a random or
representative sample of the IETF community, or even of IETF meeting
participants (and that the completeness of, and responsiveness to,
answers to one question may reflect a different opportunity sample
than those to another) -- is what my colleagues in the professional
statistics and survey businesses consider highly suspect and
questionable.

In case someone wonders, yes, I could put on my long-shelved survey
designer and statistician hat and make some suggestions about how to
make questions like that and the answers more precisely and reliably
interpretable.  But doing so would come at the cost of a longer and
more complicated survey instrument and questions that at least some
would find more intrusive.  Either or both might easily discourage
responses and make the profiles of those who respond less
representative than even the current questionnaire and process.
There are no easy solutions to those problems; the closest thing to
one is for all of us to become more careful about how much we read
into the numbers.

thanks,
   john




