When Google announced the release of Lighthouse 3.0 at the Google I/O conference, many wondered what features the newest edition would bring.
In early 2018, Google released an SEO tool called Lighthouse. It offers insights on how online marketers can improve the quality of their web pages. As an open-source automated tool, Google Lighthouse checks a page's performance, accessibility and more.
Understanding How to Use Lighthouse
Using Lighthouse to audit web pages is very simple if you know how to use Chrome DevTools. All you need to do is browse to the page in Chrome, open DevTools and go to the 'Audits' panel. Clicking 'Perform an audit' lets you configure which categories to audit according to your interests, such as performance, SEO, accessibility and more.
You will also see the page load and reload, after which a new window appears with the audit report.
After Lighthouse finishes evaluating your page, it provides an audit report with a score for each of the categories you chose when configuring the audit.
The Performance score is computed from the speed test results by comparing the speed of your website with that of others. A score of 100 means the tested page is faster than 98 percent or more of web pages; a score of 50 means the page is faster than 75 percent of the web.
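Under the hood, Lighthouse maps each raw metric value onto a score curve derived from real-world site data, which is how a millisecond measurement becomes a percentile-style score. A minimal sketch of such a log-normal curve follows; the control-point values and the function name are illustrative assumptions, not Lighthouse's actual numbers:

```python
import math

def lognormal_score(value_ms, median_ms, p10_ms):
    """Map a metric value (lower is better) to a 0-1 score on a
    log-normal curve: a page at the median value scores 0.5, and a
    page at the 10th-percentile value scores 0.9. The control points
    passed in are illustrative, not Lighthouse's real thresholds."""
    mu = math.log(median_ms)
    # Solve for sigma so the p10 value lands exactly at a score of 0.9.
    z_p10 = -1.2815515655446004  # 10th-percentile z-score of a standard normal
    sigma = (math.log(p10_ms) - mu) / z_p10
    z = (math.log(value_ms) - mu) / sigma
    # The score is the complementary normal CDF of the standardized value.
    return 0.5 * math.erfc(z / math.sqrt(2))

# A page hitting the (assumed) median of 4000 ms gets a middling score:
print(round(lognormal_score(4000, 4000, 2000), 2))  # 0.5
```

The useful property of this shape is that improvements near the fast end of the curve move the score much more than the same absolute improvement on an already-slow page.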
The other scores reflect how well the page complies with the relevant best practices. Although one of the categories is literally named 'Best Practices', that term is misleading, since the other categories offer best practices too; 'Miscellaneous' would be more appropriate.
If a question mark appears in a test result, treat it as an error: it indicates that a related test could not be conducted properly.
Following the scores overview are the performance results for six metrics, each with a tooltip explanation.
- First Contentful Paint: marks the time at which the first text or image is painted.
- First Meaningful Paint: measures when the primary content of a page is visibly populated.
- Speed Index: shows how quickly the contents of a page are visibly populated.
- First CPU Idle: marks the first time at which the page's main thread is quiet enough to handle input.
- Time to Interactive: marks the time at which the page is fully interactive.
- Estimated Input Latency: the score is an estimate of how long the app takes to respond to user input, in milliseconds, during the busiest 5-second window of page load. If the latency is higher than 50 ms, users may perceive your app as laggy.
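The overall Performance score is then a weighted blend of these per-metric scores. The sketch below shows the idea; the weight values are illustrative assumptions, not Lighthouse's exact weighting, which has varied between versions:

```python
# Illustrative weights for blending per-metric scores into one
# Performance score; the real Lighthouse weights differ by version.
WEIGHTS = {
    "first-contentful-paint": 3,
    "first-meaningful-paint": 1,
    "speed-index": 4,
    "first-cpu-idle": 2,
    "time-to-interactive": 5,
}

def performance_score(metric_scores):
    """Weighted average of per-metric scores (each 0-1), scaled to 0-100."""
    total_weight = sum(WEIGHTS.values())
    weighted = sum(WEIGHTS[m] * s for m, s in metric_scores.items())
    return round(100 * weighted / total_weight)

print(performance_score({
    "first-contentful-paint": 0.9,
    "first-meaningful-paint": 0.8,
    "speed-index": 0.7,
    "first-cpu-idle": 0.6,
    "time-to-interactive": 0.5,
}))  # 67
```

The practical consequence of any weighting like this is that a heavily weighted metric, such as Time to Interactive, drags the overall score down far more than an equally poor result on a lightly weighted one.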
Also in the report you will find a filmstrip: a step-by-step series of images of the page loading. This is very useful for confirming that pages have loaded as expected, and if a report shows discrepancies, you can verify them against the filmstrip. However, this feature is still in its infancy: as of now, it will not tell you more about what is wrong, which limits its usefulness for complicated work. Without access to the page load waterfall, you cannot dig deeper into what actually happened.
Following the performance review, you are given the best practices for each category. Most of these tips are very technical and not very detailed in themselves, but you can find out more by clicking the 'Learn more' links.
The numerous automated checks in Lighthouse are the reason it is a great audit tool. It also highlights 'Additional items to manually check', which are valuable reminders.
What Lighthouse 3.0 Brings to the Table
The most noticeable change in the updated version of Lighthouse is the refreshed UI. The report still uses circle graphs to display the score for each of the five categories, but the design is cleaner and sleeker. The metrics are also more readable: the green and orange circles next to each metric denote whether it is up to par for each audit criterion.
Scoring Weights for Performance Audit
In the previous version, the audited web page received a score between 0 and 100 for the Performance audit. This remains in the new version, but Google has shifted the percentile that each score represents. Previously, a score of 100 represented the top 5 percent of web pages; in Lighthouse 3.0 it represents the top 2 percent. As a consequence, the various audits that fall under the Performance category have changed as well.
New audits have been added in this version, including First Contentful Paint, which measures the amount of time it takes for text or image content to populate the screen; a test of whether your website's 'robots.txt' file is properly formatted; and a check that GIFs are replaced with video tags, among others.
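For reference, a well-formed 'robots.txt' of the kind the new audit checks for might look like this; the paths and sitemap URL below are placeholders, not recommendations:

```txt
# Applies to all crawlers
User-agent: *
Disallow: /admin/
Allow: /

Sitemap: https://www.example.com/sitemap.xml
```

Each record is a 'User-agent' line followed by 'Allow'/'Disallow' rules; formatting mistakes such as missing colons or rules with no preceding user-agent are the sort of thing a validity check flags.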
Two audits have undergone a makeover in V3 and have been refreshed and renamed. First Interactive is now called First CPU Idle; its purpose is the same: to measure when users can first interact with your web page.
The second change is Perceptual Speed Index, which has been shortened to Speed Index. According to Google, the purpose of this audit is still the same as in V2; only slight changes were made to the underlying metric.
Simulated throttling via an internal auditing engine code-named 'Lantern' has made faster audits possible, with less variance between runs. The engine runs audits under normal network and CPU settings and estimates how long the page would take to load in a mobile environment.
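Lantern's actual model is far more sophisticated, but the core idea — estimating mobile load time from an unthrottled trace instead of actually throttling the connection — can be sketched with simple arithmetic. All numbers and the function name below are hypothetical:

```python
def estimate_load_ms(transfer_bytes, round_trips,
                     rtt_ms=150, throughput_kbps=1600):
    """Crude estimate of page load time under simulated mobile
    throttling: per-round-trip latency plus raw transfer time.
    The default RTT and throughput loosely mimic a slow-4G profile;
    Lantern's real simulation models request dependency graphs,
    not this simple sum."""
    latency_ms = round_trips * rtt_ms
    # bits divided by kilobits-per-second yields milliseconds.
    transfer_ms = transfer_bytes * 8 / throughput_kbps
    return latency_ms + transfer_ms

# 500 KB of resources fetched over 10 round trips:
print(round(estimate_load_ms(500_000, 10)))  # 4000
```

Because the estimate is computed rather than measured under real throttling, the same trace always produces the same number, which is exactly why simulated throttling reduces variance between runs.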
The progressive approach the Google Web team has taken to improving the Lighthouse audit underscores its importance in ensuring a fast, high-performing web experience on every device. Tools like the Google Lighthouse audit are making it easier for developers to ensure that they provide users a consistent, high-standard experience.