Inject a script into every page that records userland performance metrics #2

Open · wants to merge 1 commit into base: master

Conversation

josephwynn-sc (Member)

I distrust wptagent's method of extracting performance metrics via DevTools events. They're a bit messy and we've had bugs in the past where the events are attributed to the wrong frame. This PR is my attempt at switching to a more "standard" way of gathering performance metrics, which is to use the userland JavaScript APIs.

What this PR does is inject a script into every page that creates a PerformanceObserver and collects a bunch of metrics into a global variable.
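Concretely, the injected script could look something like the sketch below. This is a hedged illustration, not the PR's actual performance_metrics.js: the global name `__wptPerformanceEntries`, the `__wptRecord` helper, and the exact list of entry types are assumptions.

```javascript
// Sketch of an injected metrics collector (hypothetical names).
(function () {
  const GLOBAL_KEY = "__wptPerformanceEntries"; // assumed global name
  globalThis[GLOBAL_KEY] = [];

  // Flatten a PerformanceEntry into a plain serializable object and
  // stash it on the global so the agent can read it back later.
  function record(entry) {
    globalThis[GLOBAL_KEY].push(
      typeof entry.toJSON === "function" ? entry.toJSON() : { ...entry }
    );
  }

  if (typeof PerformanceObserver !== "undefined") {
    const observer = new PerformanceObserver((list) =>
      list.getEntries().forEach(record)
    );
    // buffered: true picks up entries dispatched before this script ran.
    for (const type of ["longtask", "element", "paint", "largest-contentful-paint"]) {
      try {
        observer.observe({ type, buffered: true });
      } catch (e) {
        // Ignore entry types the engine doesn't support.
      }
    }
  }

  globalThis.__wptRecord = record; // exposed here only for illustration
})();
```

The agent can then pull the accumulated entries out of the global with a single `Runtime.evaluate` call at the end of the test.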

Upsides

  • The values are exactly what tools like LUX or Lighthouse would report.
  • The code is way easier to understand than the DevTools event processing.

Downsides

  • The test result data size is increased. In fact, I've opted not to include layout shifts in this PR because they create a lot of data.
  • It's a bit trickier to do things like make the metrics relative to the beginning of a script step, as opposed to relative to navigationStart.
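The second downside could be handled by capturing a per-step time origin and rebasing entry times against it. A minimal sketch, with all names hypothetical (the PR does not include this):

```javascript
// Hypothetical rebasing of metrics to a script-step start.
// stepStart would be captured when the agent signals a new step.
let stepStart = 0;

function markStepStart() {
  stepStart = performance.now();
}

// Shift an entry's startTime so it is relative to the current step
// rather than to navigationStart / timeOrigin.
function rebaseEntry(entry) {
  return { ...entry, startTime: entry.startTime - stepStart };
}
```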

andydavies (Contributor) commented Jul 22, 2021

It's an interesting approach, and one that’s worth experimenting with…

A few initial thoughts:

  • I think the userland approach will run into cross-origin/frame limitations, so it may not capture the metrics of third parties in iframes/workers (I need to refresh my knowledge of the actual limitations)

  • I'd add a timestamp to record when the inserted script executes. injectScript (which uses Runtime.evaluate) can be flaky in WPT; I'm not sure whether this will suffer the same issue, but adding a timestamp was really helpful in spotting the failures

  • It would be interesting to expose this in scripting, as it could be used to set localStorage and document.cookie so tests could bypass Quantcast Choice and other CMPs that don't rely on cookies alone

  • Have you got examples where you think Chrome et al. has got its attributions wrong? As Lighthouse etc. relies on them being right, I'd expect them to get more accurate over time

  • What size is the CLS data? I think understanding what's moving is hard for a lot of sites
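The timestamp suggestion above could be as simple as recording both the relative and absolute clocks when the injected script runs, so a missing or late injection is visible in the results (the `__wptInjectedAt` field name is hypothetical):

```javascript
// Record when the injected script actually executed.
globalThis.__wptInjectedAt = {
  now: performance.now(),             // ms since timeOrigin (navigationStart in a page)
  timeOrigin: performance.timeOrigin  // absolute epoch timestamp in ms
};
```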

});
});

observer.observe({ type: "longtask", buffered: true });
andydavies (Contributor) commented Jul 26, 2021


The list of types looks OK to me. From a synthetic PoV, I can't see anything more from this list that I'd want to add: https://www.w3.org/TR/timing-entrytypes-registry/

One challenge with this approach is that (I think) it'll only give limited visibility into what's occurring in other frames/workers.

observer.observe({ type: "element", buffered: true });
observer.observe({ type: "paint", buffered: true });

// Disabled layout shifts for now, since the resulting entries are potentially
Contributor


How large is the layout shift data? CLS is one of the things I think people struggle with most.

@@ -241,6 +243,10 @@ def start_recording(self):
        self.send_command('Debugger.enable', {})
        self.send_command('ServiceWorker.enable', {})
        self.enable_target()
        performance_metrics_script = os.path.join(self.script_dir, 'performance_metrics.js')
Contributor


Do we want to hard-code this script into the agent, or should we use the same approach as custom metrics, where it's passed from the server as a base64-encoded string?

I guess the question is how much work it is to deploy the agents.

        performance_metrics_script = os.path.join(self.script_dir, 'performance_metrics.js')
        if os.path.isfile(performance_metrics_script):
            with io.open(performance_metrics_script, 'r', encoding='utf-8') as script_file:
                self.send_command('Page.addScriptToEvaluateOnNewDocument', {'source': script_file.read()})
Contributor


Reading the docs for Page.addScriptToEvaluateOnNewDocument, it appears the script will be evaluated in every frame that gets created?

https://chromedevtools.github.io/devtools-protocol/tot/Page/#method-addScriptToEvaluateOnNewDocument

@josephwynn-sc josephwynn-sc removed the request for review from zeman November 15, 2021 20:17
3 participants