How to speed up web performance?

First of all, it is necessary to internalize the fact that the term "mobile responsive" refers not only to mobile screen sizes but also, and above all, to 3G/4G bandwidth: the real bottleneck.

The first step of the analysis is to estimate the number of potential users and their geolocation.

Based on this estimate, it is advisable to rely on the power and flexibility of cloud computing.

Once the expected load is established, decide whether it is necessary to scale only vertically or both vertically and horizontally; the next step is to design the cloud architecture.
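As a rough illustration of the horizontal-scaling decision, the number of instances can be estimated from the expected load. This is only a sketch: the request rates, the per-instance capacity and the 30% headroom factor below are invented figures, not measurements from any real project.

```javascript
// Capacity-planning sketch with placeholder figures.
function replicasNeeded(expectedRequestsPerSecond, perInstanceCapacity) {
  // Horizontal scaling: add instances until the expected load fits,
  // keeping at least one instance and ~30% headroom for traffic spikes.
  const headroom = 1.3;
  return Math.max(
    1,
    Math.ceil((expectedRequestsPerSecond * headroom) / perInstanceCapacity)
  );
}

// e.g. 500 req/s against a VM that sustains 200 req/s
console.log(replicasNeeded(500, 200)); // 4
```

If the result stays at 1 even for the peak estimate, vertical scaling alone may be enough; otherwise the architecture must be designed for multiple instances from the start.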

Unfortunately, cloud computing alone is not enough to speed up performance.

It is necessary to make the web project as light and fast as if it were a native mobile app.

Through cloud computing it is possible to scale vertically and horizontally and achieve excellent performance within the cloud network, but over a 4G connection an unoptimized web project will always be slow.

It is possible to measure latency with Stackdriver, for example, but this metric measures performance inside the cloud network, from the server's regional location to the cloud network node closest to the user.

To measure latency (page loading speed) all the way to the mobile device on a 4G network, tools like Lighthouse are needed.

Read more: Lighthouse, About PageSpeed Insights.

The bottleneck (the 3G/4G network) sits right between the last cloud node and the device.
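As a reference point when reading Lighthouse or PageSpeed Insights reports: the performance score is grouped into three published bands (0-49 poor, 50-89 needs improvement, 90-100 good). The small helper below just maps a score to its band:

```javascript
// Maps a Lighthouse/PageSpeed performance score (0-100) to its
// published rating band: 0-49 "poor", 50-89 "needs improvement",
// 90-100 "good".
function scoreBand(score) {
  if (score < 0 || score > 100) throw new RangeError('score must be 0-100');
  if (score >= 90) return 'good';
  if (score >= 50) return 'needs improvement';
  return 'poor';
}

console.log(scoreBand(95)); // "good"
```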

What about PWAs?

PWAs (Progressive Web Apps), although useful because they allow offline browsing, are, from the point of view of performance, nothing more than another browser cache. They are a good compromise if the user has already visited the page and has to reload it, but on a first visit PWAs do not solve the performance problem.
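The behavior described above is typically implemented in a service worker with a cache-first strategy. Since service worker APIs exist only in the browser, the sketch below models just the logic with a plain Map and an async fetch function; the resource names are invented for the example:

```javascript
// Cache-first lookup, modeled outside the browser: return the cached
// copy when present, otherwise pay the full network cost and store it.
async function cacheFirst(cache, fetchFromNetwork, url) {
  if (cache.has(url)) {
    return cache.get(url); // repeat visit: served instantly from cache
  }
  const response = await fetchFromNetwork(url); // first visit: full download
  cache.set(url, response);
  return response;
}

const cache = new Map();
const fakeNetwork = async (url) => `body of ${url}`; // stand-in for fetch()
```

This makes the trade-off visible: the second call for the same URL never touches the network, but the first call is exactly as slow as it would be without the PWA.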

Some choose to have one version of the web project for desktop and another for mobile, instead of a single well-optimized, mobile-first version serving both; from the SEO point of view this is not considered a best practice. If the mobile content is not identical to the desktop content, Google could penalize the website's ranking. Read more: Prepare for mobile-first indexing.

Others build native Android and iOS apps for the website, but then there is the problem of convincing users to download a native app of the web project.

If the app has a real utility, users will likely download it; otherwise they will not, and there is often also a limit to the number of native apps a user can keep on a device, e.g. a low-end mobile device.

It is therefore necessary to reprogram and optimize the chosen CMS (e.g. WordPress, Drupal, Magento), with the advantage of avoiding the additional costs of developing Android and iOS apps when these are not actually necessary.

This, together with the optimization and configuration of the VM instance, is the most substantial work, since many plugins cannot be used and everything is built by hand, both back-end and front-end.

As far as plugins are concerned, it is good practice to test them first in a demo version on localhost. Plugins are generic solutions for non-developers; there are many, some of good quality, others to be avoided like the plague.

A metaphor for the use of WordPress plugins versus the work of a professional developer (that strange and misunderstood sweatshirt-and-hoodie figure) could be the difference between eating out at a fast-food restaurant and cooking at home. When you cook at home, you know what you eat.

One last tip about the articles found around the web on how to speed up web performance with a few magic plugins: run a speed test on the web page of the author who advocates this too-simplistic solution.

To be credible, what is written should always be supported by concrete facts and working examples.

With regard to WordPress themes (WordPress being the most popular CMS), by default these ship with more or less semi-optimized performance. The problems surface when there is a need to add features to the theme, from a simple slideshow or image gallery to more complex integrations with REST APIs. The skill lies in integrating functionality without hurting speed.

One of the many aspects of web performance optimization, for example, is finding the right balance between inline JavaScript and external JavaScript, which then affects metrics like First Input Delay, i.e. how long the page takes to respond to the user's first interaction while the main thread is still busy. Only a developer's analysis can establish this balance, since JavaScript parsing, compiling and execution can block the main thread. Only this strange sweatshirt-and-hoodie figure can manipulate the code and break it into many small separate pieces.
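One way to break the code "into many small separate pieces" is code splitting with dynamic import(), so that non-critical JavaScript is downloaded, parsed and compiled only when it is actually needed. In the sketch below a data: URL stands in for what would normally be a separate bundle file (a hypothetical './gallery.js'):

```javascript
// Critical code runs immediately; the heavy module is loaded, parsed
// and compiled only on demand. The data: URL below simulates a
// separate split bundle for the sake of a self-contained example.
const bundleSource = 'export function init() { return "gallery ready"; }';
const heavyModule = 'data:text/javascript,' + encodeURIComponent(bundleSource);

async function openGallery() {
  // Deferred cost: nothing from this module is parsed until here.
  const { init } = await import(heavyModule);
  return init();
}
```

In a real project the argument to import() would be the path of the split bundle, and the bundler (e.g. webpack or Rollup) would emit it as its own small file.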

To give an idea, speeding up web performance is like turning an economy car into a racing car, and obviously development time and costs are much higher than for plugin solutions.

To maintain the CMS's performance in the future, the person in charge of publishing blog articles or new pages must be trained to deliver high-performance content and must know the front-end languages (HTML, CSS, JavaScript), using a classic HTML editor.

What about the 5G network?

The estimate is that by 2025 only 14% of global connections will be 5G, so we are just at the beginning of a very slow revolution.

Possible alternative solutions to 5G, and an insight?

Interoperable Browser Cache

By 2020 many cities are likely to benefit from the new 5G infrastructure, with high speed and very low latency. Smaller centers will continue to be connected via 3G/4G networks, so this bottleneck will remain for a long time, and anything built on the web must work for almost everyone.

Without prejudice to all the WPO techniques for loading resources quickly: if, in theory, browsers could offer, as a standard protocol, a permanent and interoperable cache of updatable common public resources (e.g. jQuery, Google Analytics, fonts), with a client-side check for whether the resource is already available in this cache rather than having to download it each time, there would be a huge gain in performance and also lower costs, for both end users and companies, and less CO2 production.

So, not a normal browser cache scoped to a single domain, but an interoperable browser cache of common resources shared across any domain.

How many websites, on different domains, use the same resource, e.g. jQuery? Why must this same resource be downloaded again from every website, or from a CDN, every time?

If it were available in a permanent browser cache, there would be no need to download it every time, since it is always the same resource.

If an older or newer version is needed, this permanent cache could hold multiple versions. In this way a common resource in the interoperable browser cache, downloaded once from one website, would be reusable on any other website without needing to be downloaded again.
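This proposal resembles content-addressed caching, the idea behind the Subresource Integrity hashes already used in script integrity attributes: if the cache is keyed by a digest of the file contents, the same bytes are fetched once regardless of which site references them, and different versions naturally get different keys. The sketch below is a toy model of that logic; the resources are invented, and a toy hash stands in for the real sha384 digest that SRI uses:

```javascript
// A shared, cross-site store keyed by content digest: two sites that
// reference byte-identical copies of jQuery would hit the same entry.
const sharedCache = new Map();

function contentKey(text) {
  // Toy stand-in for an SRI digest (real SRI uses a sha384 of the bytes).
  let h = 5381;
  for (let i = 0; i < text.length; i++) {
    h = ((h * 33) ^ text.charCodeAt(i)) >>> 0;
  }
  return 'toy-' + h.toString(16);
}

async function loadShared(fetchBytes, expectedKey) {
  if (sharedCache.has(expectedKey)) {
    return sharedCache.get(expectedKey); // no download, whatever the site
  }
  const bytes = await fetchBytes();
  if (contentKey(bytes) !== expectedKey) {
    throw new Error('integrity mismatch'); // refuse tampered content
  }
  sharedCache.set(expectedKey, bytes);
  return bytes;
}
```

The integrity check matters: a cache shared across domains is only safe if entries can be verified against the key, which is exactly what a content digest provides.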

As far as the jQuery library is concerned, it is advisable, when possible, to remove it from the CMS front-end, recompiling whatever depends on it; the reasons are perfectly explained in this GitHub article: Removing jQuery from GitHub frontend.
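Part of what makes removal feasible is that many jQuery utilities now have native equivalents. A few DOM-free examples (in the browser, `$(selector)` likewise maps to `document.querySelector` / `querySelectorAll`):

```javascript
// $.extend(target, src)  ->  Object.assign(target, src)
const merged = Object.assign({}, { a: 1 }, { b: 2 });

// $.map(array, fn)       ->  array.map(fn)
const doubled = [1, 2, 3].map((n) => n * 2);

// $.inArray(value, arr)  ->  arr.includes(value)
const hasTwo = doubled.includes(2);

// $.trim(str)            ->  str.trim()
const clean = '  hello  '.trim();
```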

In this regard there is also the irony of Eric Wastl's Vanilla JS "framework", which is in fact not a framework but a provocation, drawing attention to the potential of native JavaScript and the evolution of the DOM specifications in recent years.