
Track the web's top pain points over time #15

Open
captainbrosset opened this issue Nov 21, 2022 · 14 comments

@captainbrosset

Context

This was originally proposed by @una here: #8 (comment)

Can we also address the top pain points from the last survey(s) and see whether they are still top pain points or whether things have gotten better? I think this mapping over time is valuable:
Screenshot from the 2020 MDN DNA survey results, showing how the top 10 pain points had evolved since 2019

The MDN DNA survey ran in 2019 and 2020 and gave us great insights into how web developers experience the web platform.

It received 76K responses in 2019 and 31K in 2020.
Both times, people had to rank 28 "needs" using a MaxDiff study (so only a few needs were displayed at each step of the survey).
The list of 28 needs was mostly the same across the two years, with some differences: the 3 lowest-ranking needs from 2019 were dropped in 2020 and replaced with 3 new ones.

The top 10 needs stayed the same over the two years, but their order changed a bit. Here is the 2020 list:

  1. Having to support specific browsers (e.g., IE11).
  2. Avoiding or removing a feature that doesn’t work across browsers.
  3. Making a design look/work the same across browsers.
  4. Testing across browsers.
  5. Discovering bugs not caught during testing.
  6. Outdated or inaccurate documentation for frameworks and libraries.
  7. Keeping up with a large number of new and existing tools or frameworks.
  8. Managing user data to comply with laws and regulations.
  9. Supporting multiple frameworks in the same code base.
  10. Understanding and implementing security measures.

Running a new survey

Two years have passed, and it seems that knowing the answers to the following questions would be useful:

  • Are the top pain points identified in 2019/2020 still the same today?
  • Did the order change?
  • Are there new pain points?

Furthermore, tracking this over time (for example once a year) seems useful too.

@captainbrosset
Author

Some discussions about this happened elsewhere; let me try to summarize the important points here:

@robnyman, @dontcallmedom, and I talked about potentially doing several types of surveys:

  1. DNA-style surveys, on a yearly (or even less frequent) basis.

    This seems like a useful way to keep track of slow-moving trends. It is hard to act on the results, though, as they tend to be quite high-level. This shouldn't be our only means of finding out about the web's pain points, but it can teach us about the main themes and how they evolve over time.

  2. Compat-style surveys (see compat report 2020), on a yearly basis.

    This is more actionable and helps identify areas to work on.

  3. More frequent MDN short surveys (quarterly?)

    The Web DX group already ran a couple of short surveys on MDN thanks to @Rumyra. This can be a good way to quickly find out about specific topics. It's most likely not a good vehicle for tracking the pain points identified in the DNA surveys, though, as MDN short surveys are meant to be short and basic, with only a few quick questions.

@robnyman
Member

In short, I think all of these are interesting and serve a purpose. Also – particularly in the current economy – it would rather come down to resources, people, and money to figure out what we can run and how we prioritize them.

@captainbrosset
Author

captainbrosset commented Nov 22, 2022

it would rather come down to resources, people, and money to figure out what we can run and how we prioritize them

Definitely a fair point, and MDN short surveys are currently the fastest and cheapest way we have to survey developers.

However, these types of surveys can't be compared with the DNA-style surveys. In a short survey, we couldn't ask users to go through a long and tedious MaxDiff sorting process. We could use a simpler ranking question instead, but that has drawbacks: much lower fidelity in the results, and no way to know how far apart the pain points are from one another. Also, ranking 28 needs (the full list from the DNA survey) takes time, and we'd likely see low participation.

If we were to go with an MDN short survey format, it'd be good to lock down a short list of pain points and make it quick for people to signal their top point(s). It would also be good to allow free-text answers.

@foolip
Collaborator

foolip commented Nov 25, 2022

The most obvious approach would be to take the 28 needs statements from 2020 and ask developers to rank them using the same question type (MaxDiff) as originally. Unfortunately, this takes a lot of time for survey takers, and any change to the list of 28 statements could influence the results.

Another approach is asking "how satisfied are you with X" at many points over time, similar to the Satisfaction by Subcategory from MDN DNA 2020. Unfortunately we expect the responses to change slowly over time, so we'd need a lot of data to detect any real change.
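
As a rough illustration of how much data "a lot" means here, the sketch below runs a standard two-proportion sample-size calculation. The 60% baseline, the 3-point shift, and the 80% power target are illustrative assumptions, not figures from the DNA survey.

```python
import math

# How many respondents per survey wave would we need to detect a small shift
# in the share of "satisfied" developers between two waves?
# The baseline (60%) and the shift to detect (3 points) are assumptions.
p1, p2 = 0.60, 0.63            # assumed satisfaction share in wave 1 and wave 2
z_alpha, z_power = 1.96, 0.84  # normal quantiles for alpha = 0.05 (two-sided), power = 0.80

# Cohen's arcsine effect size for comparing two proportions.
h = abs(2 * math.asin(math.sqrt(p2)) - 2 * math.asin(math.sqrt(p1)))

# Required respondents per wave for two independent, equally sized samples.
n_per_wave = 2 * (z_alpha + z_power) ** 2 / h ** 2
print(f"h = {h:.3f}, ~{n_per_wave:,.0f} respondents per wave")
# => roughly 4,000 per wave; detecting small shifts in slow-moving metrics
#    really does require large samples.
```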

For both of the above, one also has to consider whether the audience is equivalent at every point in time, to avoid mix shift effects.

Finally, we could ask if developers think things have improved in the past 12 months. In order to make sense of the results we'd need to ask about the improvements across a number of things, including things we're confident haven't improved.

These are just my non-researcher ideas.

@captainbrosset
Author

Thanks @foolip. Those are great points.
Coincidentally, we had a quick discussion about categorization with @Rumyra and @schalkneethling elsewhere, so let me summarize what was said here.

Ruth is proposing to categorize the pain points into larger buckets (testing, documentation, browser compat, etc.) and then test the buckets one by one using MDN short surveys with a "don't care"/"care some"/"care a lot" scale.
To do this, the grouping found in the DNA satisfaction-by-subcategory question, which you mentioned, will be useful.

Schalk also proposed running another short survey with the top 3 of each category so we can also compare across buckets.

That's a good point about the mix-shift effect too. I wonder if there's a way we can design the survey to keep it from being too much of a problem.

I'll report back here with a complete list of categories and pain points to move this forward.

@captainbrosset
Author

I came up with the categories below; they have an average of 5 items each.
If we were to ask something like "How much frustration do you experience with the following web development activities (none, a bit, a lot)?", then I think we could cover 2 or 3 categories in a single survey. Going through 10 to 15 activities and checking a radio button for each seems quick enough for an MDN short survey. What do you think?

I'd also love to add a question asking whether any new pain points have emerged, plus Philip's questions about whether things have improved over the past 12 months.

Interoperability
  • Having to support specific browsers (e.g., IE11).
  • Avoiding or removing a feature that doesn’t work across browsers.
  • Making a design look/work the same across browsers.
  • Lack of support for progressive web apps (PWAs).
Testing and debugging
  • Testing across browsers.
  • Running end-to-end tests.
  • Running front-end tests.
  • Discovering bugs not caught during testing.
  • Determining the root cause of a bug.
  • Fixing a bug once it’s been identified.
Documentation and learning
  • Outdated documentation for HTML, CSS and JavaScript.
  • Knowing what browsers support a specific technology.
  • Keeping up with changes to the web platform.
  • Deciding what to learn next to keep my skill set relevant.
  • Finding a community of peers.
Frameworks, libraries, build tools, and services
  • Outdated or inaccurate documentation for frameworks and libraries.
  • Supporting multiple frameworks in the same code base.
  • Keeping up with a large number of new and existing tools or frameworks.
  • Using web technologies in a native or hybrid context (e.g., using WebViews, Electron, CEF).
  • Integrating with third parties for authentication.
Web capabilities
  • Lack of device APIs allowing for access to hardware.
  • Capability of the web to support a specified layout.
  • Achieving visual precision on stylized elements (e.g., buttons).
  • Getting users to grant permissions to Web APIs (e.g., geo-location).
Compliance with laws, regulations, and best practices
  • Managing user data to comply with laws and regulations.
  • Working with different tracking protection and data storage policies in browsers.
  • Understanding and implementing security measures.
  • Pinpointing existing performance issues.
  • Implementing performance optimizations.
  • Implementing localization.
  • Making sites accessible.

@captainbrosset
Author

Some notes from the WebDX CG meeting today, in no particular order:

  • What do we expect to do with the data? How would we exploit it?
  • Running multiple MDN short surveys introduces the risk of not being able to compare things to each other.
  • Running each category after the other means we'd need a long time to go through all 6 of them. Could we run multiple surveys in parallel?
  • Ideally, we'd do the MDN DNA survey again, but as a shorter version, focusing only on the needs ranking.
  • Adding a free-form text question to capture new needs is good, but it requires a lot of responses for anything useful to come through.
  • The needs from the DNA survey may need to be refreshed, for example "Having to support specific browsers (e.g., IE11)".

As an action item for next steps, I volunteered to take a look at the questions from the DNA survey and see how we could trim it down to a shorter version. @Rumyra, let me know if you have access to it.

@robnyman
Member

robnyman commented Dec 6, 2022

@captainbrosset

The needs from the DNA survey may need to be refreshed, for example "Having to support specific browsers (e.g., IE11)".

I think that's worth discussing. In the case of IE11 specifically, even though it has reached end of support, I think it's still interesting to either get confirmation that developers have now dropped support for it, or learn that they still need to support it for the foreseeable future due to slow update rates among users.

@Rumyra

Rumyra commented Dec 8, 2022

@captainbrosset - I've had a quick look into it. We still have the original results website running at https://insights.developer.mozilla.org/ and, albeit archived, the repo powering it: https://github.com/mdn/insights

I haven't been able to find the results in Alchemer. I'm enquiring, as I expect it wasn't called the 'DNA survey' when it was created (@atopal, do you have any insight into that?).

However, the PDFs are available in the repo at https://github.com/mdn/insights/tree/main/public/reports/pdf, which could still prove helpful.

@captainbrosset
Author

Thank you @Rumyra. I looked into the reports in more detail, and even though we don't have the survey questions, I think it's quite easy to imagine what a trimmed-down version of the DNA survey would be: the MaxDiff ranking of the 28 needs.

The DNA survey had more questions about technologies and what's missing from the web, which we could remove to create a shorter survey.

The DNA survey displayed sixteen sets of five need statements, so each respondent saw 80 statements in total and each of the 28 needs roughly 3 times (80 / 28 ≈ 2.9). For each set, respondents were asked to pick the one need that caused them the least frustration and the one that caused them the most.

I don't know if these sets were created randomly for each person taking the survey, or designed beforehand.

Also of interest: 79.6% of respondents agreed (or strongly agreed) that the list of needs was a fair representation of the needs they experience as a web developer. But 13.4% neither agreed nor disagreed, which suggests there is room for improvement in the needs list.

Finally, it's worth noting that a sophisticated analysis of the MaxDiff data was done for the DNA survey, and to extract comparable results from a new survey we'd need to do the same thing.
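
For reference, the simplest way to turn raw MaxDiff picks into a ranking is a best-minus-worst count per need. The sketch below illustrates that baseline approach with made-up picks; the DNA analysis used a more sophisticated model than this, which the sketch does not attempt to reproduce.

```python
from collections import Counter

# Hypothetical raw picks: for each set shown, we record which need the
# respondent chose as most frustrating and which as least frustrating.
# The needs and picks below are illustrative, not real survey data.
picks = [
    {"most": "Testing across browsers", "least": "Implementing localization"},
    {"most": "Having to support specific browsers", "least": "Finding a community of peers"},
    {"most": "Testing across browsers", "least": "Running front-end tests"},
]

most_counts = Counter(p["most"] for p in picks)
least_counts = Counter(p["least"] for p in picks)

# Best-minus-worst score: higher means more often picked as the top frustration.
# A real analysis would also normalize by how often each need was shown.
needs = set(most_counts) | set(least_counts)
scores = {need: most_counts[need] - least_counts[need] for need in needs}

for need, score in sorted(scores.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{score:+d}  {need}")
```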

@captainbrosset
Author

In the event that we want to try running a simplified/trimmed-down version as an MDN short survey, I took a stab at coming up with a new list of needs:

  • Having to support specific browsers.
  • Avoiding or removing a feature that doesn’t work across browsers.
  • Making a design look/work the same across browsers.
  • Achieving visual precision on stylized elements (e.g., buttons).
  • Lack of capabilities to implement specific use cases.
  • Making web sites/applications accessible.
  • Making web sites/applications performant.
  • Making web sites/applications secure.
  • Using web technologies in a native or hybrid context (e.g., using WebViews, Electron, CEF).
  • Working with different tracking protection and data storage policies in browsers.
  • Outdated documentation for HTML, CSS and JavaScript.
  • Knowing what browsers support a specific technology.
  • Keeping up with changes to the web platform.
  • Testing across browsers.
  • Running automated tests.
  • Pinpointing existing performance issues.
  • Determining the root cause of a bug.
  • Keeping up with a large number of new and existing tools or frameworks.
  • Outdated or inaccurate documentation for frameworks and libraries.
  • Supporting multiple frameworks in the same code base.

This is largely inspired by the DNA survey needs, but with a few changes:

  • I rephrased a few needs that were too specific or harder to understand.
  • I removed the three least frustrating needs from 2020.
  • I removed a few other needs where knowing whether they're frustrating didn't seem likely to benefit anyone in the ecosystem, and removed others that were too specific.

This list has 20 needs. If we wanted to run a MaxDiff on it, we could use 10 sets of 6 needs so that each need is seen exactly 3 times (10 × 6 = 60 = 20 × 3); one way to build such sets is sketched below.
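
Assuming the sets are generated randomly per respondent, a simple way to get this exact balance is to deal three shuffled copies of the list into 10 groups of 6 and retry whenever a group contains a duplicate. A minimal sketch with placeholder need names:

```python
import random

def maxdiff_sets(needs, n_sets=10, set_size=6, appearances=3, seed=None):
    """Randomly deal `appearances` copies of each need into `n_sets` sets of
    `set_size` items, retrying until no set shows the same need twice.
    Brute-force retrying is fine at this scale; a valid deal turns up quickly."""
    assert n_sets * set_size == len(needs) * appearances
    rng = random.Random(seed)
    while True:
        pool = [need for need in needs for _ in range(appearances)]
        rng.shuffle(pool)
        sets = [pool[i * set_size:(i + 1) * set_size] for i in range(n_sets)]
        if all(len(set(s)) == set_size for s in sets):  # no duplicates within a set
            return sets

# Abbreviated placeholder names; a real run would use the 20 needs listed above.
needs = [f"need {i:02d}" for i in range(1, 21)]
for i, s in enumerate(maxdiff_sets(needs, seed=42), start=1):
    print(f"set {i}: {s}")
```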

@captainbrosset
Author

Perhaps I should have googled this topic earlier... I only just stumbled upon @PaulKinlan's quarterly web pain point survey, which seems to fill this gap nicely.

Feels to me like we shouldn't unnecessarily duplicate efforts here.

@jgraham

jgraham commented Feb 17, 2023

Just to record some of my thoughts around this somewhere:

  • I think that having a general high-level survey that we run infrequently (e.g. once or twice a year) would be great.
  • I think that the DNA survey has a couple of problems we should be wary of repeating:
    • Difficult to complete: we added too many other questions, but MaxDiff also seems to be really high overhead compared to a CSAT-style approach ("How satisfied are you with X?").
    • The needs are often ambiguous in problematic ways, e.g. I never understood whether the results for "Having to support specific browsers (e.g. IE11)" were really about IE still being a requirement for some authors, or more generally about a lack of interop between modern browsers.

So I'm more in favour of working from the ground up, based on specific questions we want to be able to answer. For example, one thing we are interested in (and others also seem to want to know about) is "which versions of browsers are developers actually supporting, and what process do they go through to support them?". Previous surveys have had a "which browsers do you support?" question and a "how far back do you support browsers?" question, but they haven't allowed the answers to be coupled. A better design here might be along the lines of "Think of a site you worked on in the last six months. For this site, which versions of the following browsers did you have to support?", with specific options for the major browsers (e.g. unsupported, current release only, back to ESR, specific versions). Then, for that specific project / set of browsers, you could ask how they go about testing for support (to get information on manual vs automated tests and choice of tooling), and ask about problems they ran into with that browser on that project.
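
To make the "coupled answers" idea concrete, here is a minimal sketch of how a project-scoped response could be stored so that browser versions, testing approach, and problems stay linked. All field names, option values, and example data are invented for illustration.

```python
from dataclasses import dataclass, field

@dataclass
class ProjectResponse:
    # browser -> "unsupported" | "current" | "esr" | "specific versions"
    browsers_supported: dict[str, str]
    testing: list[str]                      # e.g. ["manual", "automated e2e"]
    problems: list[str] = field(default_factory=list)

responses = [
    ProjectResponse({"Firefox": "esr", "Chrome": "current", "IE": "unsupported"},
                    ["automated e2e"]),
    ProjectResponse({"Firefox": "current", "Chrome": "current", "IE": "specific versions"},
                    ["manual"], ["flexbox rendering differences"]),
]

# Because every answer is scoped to one project, cross-cuts that independent
# questions can't support become straightforward, e.g.:
ie_projects = [r for r in responses if r.browsers_supported.get("IE") != "unsupported"]
manual_only = [r for r in ie_projects if r.testing == ["manual"]]
print(f"{len(manual_only)}/{len(ie_projects)} IE-supporting projects test manually only")
```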

Obviously for other topics like general concerns about the web stack and surrounding tooling we might need different questions. But I also think we already get more data about that from surveys like State of CSS.

@captainbrosset
Author

See #20. Google has agreed to share their "devSat" survey questions. The very first satisfaction questions are very similar to the Web DNA questions, but have obviously been updated too.
