Recently a store I managed started failing its Core Web Vitals assessment on the CLS metric. The pages fell into two types: product pages and category pages. A quick check using Chrome and Lighthouse showed that the pages actually had really good CLS in the desktop view.
Using a tool like Screaming Frog I could fetch the PageSpeed scores for these problem page types. This confirmed Google was holding very different CLS scores compared to the real scores I got from running Lighthouse in my own browser. One thing I noticed is that every time you run a Lighthouse report it sends the data to Google to use in its scoring. So after a few days the pages I had checked in Lighthouse suddenly looked OK in Core Web Vitals.
So the goal now was to run a Lighthouse performance report against all 3,700 products in the store. Doing this manually would take forever, so I looked into running the reports in batch using Chrome from the CLI.
Here is a guide on how to achieve this on a Linux device. In this guide I use a Debian-based distribution, i.e. Ubuntu, Mint or Zorin.
Install Node.js and NPM

sudo apt update
sudo apt install nodejs npm
Install the latest version of Node.js, as Lighthouse needs this
link : https://github.com/nodesource/distributions/blob/master/README.md#debinstall
# Using Ubuntu
curl -fsSL https://deb.nodesource.com/setup_18.x | sudo -E bash -
sudo apt-get install -y nodejs
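To confirm the upgrade took, it's worth checking the major version that `node -v` now reports. A small sketch of that check (the "18" threshold is an assumption based on current Lighthouse releases; check Lighthouse's own requirements for your version):

```shell
# node_major: extract the major version from a "node -v" string like "v18.19.1"
node_major() {
  v=${1#v}                 # drop the leading "v"
  printf '%s\n' "${v%%.*}" # keep only the part before the first dot
}

# usage:
#   [ "$(node_major "$(node -v)")" -ge 18 ] || echo "Node.js too old for Lighthouse" >&2
```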
Install Lighthouse
sudo npm install -g lighthouse
Using lighthouse
# perform a lighthouse report and produce an HTML report in the directory you are in
lighthouse https://www.domain.co.uk/page.html

# perform a lighthouse report and open it in a browser once it is done
lighthouse https://www.domain.co.uk/page.html --view
Performing a desktop report
The above example will perform a mobile render by default. If you want to perform a desktop report, you will need to create a config.js like this.
module.exports = {
  extends: 'lighthouse:default',
  settings: {
    formFactor: 'desktop',
    throttling: {
      rttMs: 40,
      throughputKbps: 10240,
      cpuSlowdownMultiplier: 1,
      requestLatencyMs: 0,
      downloadThroughputKbps: 0,
      uploadThroughputKbps: 0
    },
    screenEmulation: {
      mobile: false,
      width: 1350,
      height: 940,
      deviceScaleFactor: 1,
      disabled: false
    },
    emulatedUserAgent: 'Mozilla/5.0 (Macintosh; Intel Mac OS X 10_14_6) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/84.0.4143.7 Safari/537.36 Chrome-Lighthouse'
  }
}
This will allow you to create a desktop report and place the completed report in a different folder.
lighthouse https://www.domain.co.uk/page.html --config-path ./config.js --output-path=./reports/page.html
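Since CLS was the metric I cared about, it's also handy to know Lighthouse can emit the report as JSON (`--output=json`) instead of HTML, which makes the scores scriptable. A sketch of pulling the CLS value out of such a report, assuming `jq` is installed ("cumulative-layout-shift" is the audit id in Lighthouse's JSON output):

```shell
# First produce a JSON report (needs lighthouse installed, so shown as a comment):
#   lighthouse https://www.domain.co.uk/page.html --config-path ./config.js \
#     --output=json --output-path=./reports/page.json

# cls_from_report: print the CLS value from a Lighthouse JSON report file
cls_from_report() {
  jq -r '.audits["cumulative-layout-shift"].numericValue' "$1"
}

# usage:
#   cls_from_report ./reports/page.json
```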
After this I converted the site's XML sitemap to CSV and then created a spreadsheet, using the =CONCAT() function to build a batch.sh file which could be run from the terminal. Even with 3,700 URLs to process it still took my laptop around 14 hours to complete the task, but at least I did not have to press a button.
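If you'd rather skip the spreadsheet, the same batch file can be generated in the shell. A sketch, assuming a plain urls.txt with one URL per line (the `page-N.html` naming is my own convention, not anything Lighthouse requires):

```shell
# make_batch: read URLs from stdin and emit one lighthouse command per URL,
# writing each report to a numbered file under ./reports/
make_batch() {
  i=0
  while IFS= read -r url; do
    i=$((i + 1))
    printf 'lighthouse %s --config-path ./config.js --output-path=./reports/page-%d.html\n' "$url" "$i"
  done
}

# usage:
#   mkdir -p reports
#   make_batch < urls.txt > batch.sh
#   sh batch.sh
```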