AutoWebPerf (AWP)
<p align="left"> <img src="https://i.imgur.com/f87A9xi.png" width="200" alt="quicklink"> </p>

AutoWebPerf provides a flexible and scalable framework for running web performance audits with arbitrary audit tools like WebPageTest and PageSpeedInsights. The library enables developers to collect metrics consistently and store them in a preferred data store, such as local JSON files, Google Sheets, BigQuery, or an in-house SQL database.
Check out https://web.dev/autowebperf for an introduction.
How it works
AutoWebPerf takes a list of Tests from an arbitrary data store, such as local JSON files, Google Sheets, BigQuery, or a self-hosted SQL database. It then executes audits based on each Test's config and collects the metrics from the individual data sources into a list of Results.
The process of running an audit through a measurement tool (e.g. WebPageTest) is defined in an individual Gatherer. The logic for reading from and writing to a data platform (e.g. local JSON) is implemented in a Connector.
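The Gatherer/Connector split can be sketched as below. This is a hypothetical illustration with toy class and method names, not AWP's actual API; the real interfaces live in src/gatherers and src/connectors.

```javascript
// Hypothetical sketch of AWP's core loop (toy names, not the real API):
// a Connector supplies Tests, a Gatherer turns each Test into a Result.

// A toy connector that keeps Tests and Results in memory instead of JSON files.
class InMemoryConnector {
  constructor(tests) {
    this.tests = tests;
    this.results = [];
  }
  getTests() {
    return this.tests;
  }
  appendResults(results) {
    this.results.push(...results);
  }
}

// A toy gatherer that fabricates a metric instead of calling a real audit API.
class FakeGatherer {
  run(test) {
    return {label: test.label, url: test.url, metrics: {FirstContentfulPaint: 1200}};
  }
}

// The core flow: read Tests from the connector, run each through a gatherer,
// then write the collected Results back through the connector.
function runAudits(connector, gatherer) {
  const results = connector.getTests().map((test) => gatherer.run(test));
  connector.appendResults(results);
  return results;
}

const connector = new InMemoryConnector([
  {label: 'Example', url: 'https://example.com'},
]);
const results = runAudits(connector, new FakeGatherer());
console.log(results.length); // 1
```

Because the loop only talks to the two interfaces, swapping WebPageTest for PageSpeedInsights, or JSON files for Google Sheets, means swapping one module without touching the rest.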
Feature highlights
- A web audit automation library that can be plugged into any platform, such as Google Sheets, GCP App Engine, or simply a cron job that writes to a JSON file.
- The ability to run recurring tests with customizable frequency (e.g. daily, weekly, or monthly), network conditions, and other audit configs.
- Metric gatherers are designed as modules that are decoupled from the output data format and automation logic.
- Connector modules read the Test list and write audit results to a specific data format or platform, e.g. a connector for CSV files. (See src/connectors/csv-connector for details.)
How does this compare to the rest of Google's speed measurement tools?
AutoWebPerf serves as a performance audit aggregator that automates performance auditing and metrics collection across multiple speed measurement tools, including WebPageTest, PageSpeedInsights, and Chrome UX Report. As each individual tool produces its audit metrics, AutoWebPerf aggregates the results and writes them to any preferred data storage platform, such as local JSON files, a cloud-based database, or Google Sheets.
Quickstart
First, clone the AWP repo and install dependencies:
git clone https://github.com/GoogleChromeLabs/AutoWebPerf.git
cd AutoWebPerf
npm install
Once finished, check the install by running a single test with the following command:
./awp run examples/tests.json output/results.json
This command uses the example file in examples/tests.json and returns the results to output/results.json.
To start recurring tests, add a recurring.frequency property to each Test in the test file. Then use the following command to set the next trigger time and run each test once:
./awp recurring examples/tests-recurring.json output/results.json
If this was successful, the trigger time will have been updated based on your chosen frequency, and a result will have been written to output/results.json.
Once the trigger time is set, you can have your tests auto-run at the next trigger time with the continue command:
./awp continue examples/tests-recurring.json output/results.json
This will automatically run each test at the specified frequency. More information can be found in the "Run recurring tests" section below.
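The recurring test file used above (examples/tests-recurring.json) presumably contains Tests like the following; before the first run, only the frequency needs to be set, and AWP fills in the trigger timestamp itself. (The recurring structure is taken from the example in the "Run recurring tests" section.)

```json
{
  "label": "web.dev",
  "url": "https://web.dev",
  "recurring": {
    "frequency": "Daily"
  }
}
```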
More Examples
Single URL: To test a single URL through PageSpeedInsights:
./awp run url:https://www.thinkwithgoogle.com/ json:output/results.json
Pick Gatherer: To test a single URL with a specific gatherer, such as PageSpeedInsights or WebPageTest:
./awp run --gatherers=psi url:https://web.dev json:output/results.json
CSV file: To run tests defined in a CSV file and write results to a JSON file:
./awp run csv:examples/tests.csv json:output/results.json
PageSpeedInsights API: To run PageSpeedInsights tests with an API Key:
PSI_APIKEY=SAMPLE_KEY ./awp run examples/tests.json output/results.json
WebPageTest API: To run WebPageTest tests:
WPT_APIKEY=SAMPLE_KEY ./awp run examples/tests-wpt.json output/results.json
Override vs. Append: To run tests and override existing results in the output file:
./awp run examples/tests.json output/results.json --override-results
Available gatherers:
- WebPageTest - See docs/webpagetest.md for details.
- PageSpeed Insights - See docs/psi.md for details.
- Chrome UX Report API - See docs/cruxapi.md for details.
- Chrome UX Report BigQuery - See docs/cruxbigquery.md for details.
Available connectors:
- JSON connector - reads or writes to local JSON files. This is the default connector when a connector name is not specified. For example:
./awp run examples/tests.json output/results.json
Alternatively, to specify using the JSON connector for the Tests path and the Results path:
./awp run json:examples/tests.json json:output/results.json
- CSV connector - reads or writes to local CSV files.
To specify using the CSV connector for the Tests path and the Results path:
./awp run csv:examples/tests.csv csv:output/results.csv
- URL connector - generates just one Test with a specific URL to audit. To run an audit with just one Test for a specific URL:
./awp run url:https://example.com csv:output/results.csv
Please note that this connector only works for the Tests path, not for the Results path.
- Google Sheets connector - see docs/sheets-connector.md for detailed guidance.
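For the CSV connector, a Tests file presumably carries one Test per row, with columns mirroring the Test properties. The header below is purely illustrative; check examples/tests.csv for the actual column names:

```csv
label,url
Test-1,https://example1.com
Test-2,https://example2.com
```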
Using AWP with Node CLI
Run tests
You can run the following at any time to print CLI usage:
./awp --help
To run tests, run the following CLI command with a given Tests JSON file, like
examples/tests.json, which contains an array of Tests. You can check out
examples/tests.json for the data structure of Tests.
./awp run examples/tests.json output/results.json
This will write the generated Result object(s) to the given path, output/results.json.
By default, AWP uses the JSON connector for both reading tests
and writing results. Alternatively, you can specify a different connector in the
format <connector>:<path>.
E.g. to run tests defined in a CSV file and write results to a JSON file:
./awp run csv:examples/tests.csv json:output/results.json
Retrieve test results
For some audit platforms, like WebPageTest, each test may take a few minutes to produce actual results. For these types of asynchronous audits, each Result stays in "Submitted" status, and you will need to explicitly retrieve the final results later.
Run the following to retrieve the final metrics of the Results in
results.json:
./awp retrieve examples/tests.json output/results.json
This will fetch metrics from all audit platforms and update the Result objects
in output/results.json. You can check out examples/results.json for
details of the Result objects.
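Before retrieval, a pending Result for an asynchronous gatherer might look roughly like the following. The nested field names and placeholder test ID are illustrative guesses; see examples/results.json for the authoritative shape:

```json
{
  "label": "web.dev",
  "url": "https://web.dev",
  "status": "Submitted",
  "webpagetest": {
    "metadata": {
      "testId": "SAMPLE_TEST_ID"
    }
  }
}
```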
Run recurring tests
If you'd like to set up recurring tests, define a recurring object
containing a frequency for that Test, then run:
./awp recurring examples/tests-recurring.json output/results.json
This will generate the Result object in results.json and update the next
trigger time in the original Test object in tests.json. E.g. the updated
Test object would look like the following, with an updated nextTriggerTimestamp:
{
"label": "web.dev",
"url": "https://web.dev",
"recurring": {
"frequency": "Daily",
"nextTriggerTimestamp": 1599692305567,
"activatedFrequency": "Daily"
},
"psi": {
"settings": {
"locale": "en-GB",
"strategy": "mobile"
}
}
}
The nextTriggerTimestamp is advanced to the next day based on the previous
timestamp. This prevents repeated runs of the same Test and guarantees
that the Test is executed only once per day.
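The bump itself is just date arithmetic on the epoch-millisecond timestamp. The sketch below is a hypothetical illustration, not AWP's actual implementation; real frequency handling (and values like "Monthly") may differ:

```javascript
// Hypothetical sketch: advance a trigger timestamp (epoch milliseconds)
// by one frequency interval. Not the library's actual implementation.
const FREQUENCY_MS = {
  Daily: 24 * 60 * 60 * 1000,        // one day
  Weekly: 7 * 24 * 60 * 60 * 1000,   // seven days
};

function nextTrigger(previousTimestamp, frequency) {
  return previousTimestamp + FREQUENCY_MS[frequency];
}

// Using the Daily timestamp from the example Test above:
console.log(nextTrigger(1599692305567, 'Daily')); // 1599778705567
```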
Set up a cron job to run recurring tests
In most Unix-like operating systems, you can set up a cron job to run the AWP CLI periodically.
For example, in macOS, you can run the following commands to set up a daily cron job with AWP:
# Edit the cronjob with a text editor.
EDITOR=nano crontab -e
Add the following line to the crontab for a daily run at 12:00 noon. Note that the schedule is based on the system time of the machine running AWP. The environment variable must come after the cd, so that it applies to the awp command rather than to cd:
0 12 * * * cd ~/workspace/awp && PSI_APIKEY=SAMPLE_KEY ./awp run examples/tests.json csv:output/results-recurring.csv
Run tests with extensions
An extension is a module that assists AWP in running tests with additional processing and
computation. For example, the budgets extension can add performance budgets
and compute the delta between the targets and the resulting metrics.
To run with extensions:
./awp run examples/tests.json output/results.json --extensions=budgets
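A Test using the budgets extension could carry target values alongside its gatherer config. The budget property names below are guesses for illustration only; consult the budgets extension source for the actual schema:

```json
{
  "label": "web.dev",
  "url": "https://web.dev",
  "psi": {
    "settings": {
      "strategy": "mobile"
    }
  },
  "budgets": {
    "budget": {
      "FirstContentfulPaint": 1500
    }
  }
}
```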
Tests and Results
Define the Tests
The list of tests is simply an array of Test objects, like the sample Tests
below. Or check out src/examples/tests.js for a detailed example of a Tests
list.
[{
"label": "Test-1",
"url": "example1.com",
"webpagetest": {
...
}
}, {
"label": "Test-2",
"url": "example2.com",
"psi": {
...
}
}]
Each Test object defines which audits to run via its gatherer properties.
For example, the first Test has a webpagetest property which defines the
configuration for running a WebPageTest audit. The second Test has a psi
property that defines how to run a PageSpeedInsights audit.
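Filling in the skeleton above with the psi settings shown earlier in this README, a minimal complete Tests list could look like this:

```json
[{
  "label": "Test-2",
  "url": "example2.com",
  "psi": {
    "settings": {
      "locale": "en-GB",
      "strategy": "mobile"
    }
  }
}]
```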
Generate the Results
After running tests, a list of Results is generated. Each Result
contains the metrics collected by its configured gatherers, such as
WebPageTest and PageSpeedInsights.
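A rough sketch of a Result is shown below. The status value and metric names here are illustrative only; see examples/results.json for the real structure:

```json
[{
  "label": "Test-2",
  "url": "example2.com",
  "status": "Retrieved",
  "psi": {
    "metrics": {
      "FirstContentfulPaint": 1200,
      "SpeedIndex": 2300
    }
  }
}]
```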
