Building faster applications requires knowledge of performance testing and troubleshooting in both test and production environments. The following recommendations will reduce web page load time for applications.
Content Delivery Network (CDN)
The root cause of web application performance problems is latency, and propagation delay is the primary cause of latency. Web page load time increases with physical distance from the web server. Caching your static and dynamic content on a local edge server virtually eliminates propagation delay (distance). The other variable is server processing delay, which also contributes to slower application performance.
For example, say you are located in Miami and you connect to an application hosted in a data center in San Francisco or Seoul. Your initial connection through the CDN would require a one-time content fetch from the origin (Apache) web server. Most CDN services, such as Cloudflare, can host both static and dynamic content with a specified refresh interval. Once your files are cached at an edge server in Miami, the application has roughly the same delay as if it were hosted locally. There are also ancillary network services such as a web application firewall, DNS, and data encryption that provide internet cyber security. You can work remotely from any location worldwide and get nearly the same local response time.
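As a rough sketch of how that refresh interval is controlled, the origin can set a Cache-Control response header that tells the CDN edge (and browsers) how long a response may be reused before it must be refetched. The Node.js server and the one-hour TTL below are illustrative assumptions, not a specific CDN's requirement.

const http = require('http');

// Minimal origin server sketch: Cache-Control governs how long an edge server
// or browser may reuse this response before refreshing it from the origin.
http.createServer((req, res) => {
  res.setHeader('Cache-Control', 'public, max-age=3600'); // cacheable for one hour
  res.setHeader('Content-Type', 'text/plain');
  res.end('Hello from the origin server');
}).listen(8080);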

Code Efficiency
As a web developer you can refine your code, whether it is HTML or JavaScript for example, to speed up your applications. This also helps during periods when web server utilization is higher.
Reduce SQL Fetches: It is important to minimize turns between the application server and the database server. Refine your code to request the most data with the fewest fetches. Keeping the default packet size of 1500 bytes, or larger within the data center, also reduces application turns.
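A minimal sketch of what reducing turns looks like in application code, assuming a hypothetical db.query(sql, params) helper that returns a promise; the table and column names are illustrative.

// Anti-pattern: one fetch (turn) per customer id -- N round trips to the database server.
async function getOrdersOneByOne(db, customerIds) {
  const orders = [];
  for (const id of customerIds) {
    orders.push(...await db.query('SELECT * FROM orders WHERE customer_id = ?', [id]));
  }
  return orders;
}

// Better: request the same data in a single fetch -- one round trip regardless of list size.
async function getOrdersInOneFetch(db, customerIds) {
  const placeholders = customerIds.map(() => '?').join(',');
  return db.query(`SELECT * FROM orders WHERE customer_id IN (${placeholders})`, customerIds);
}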
Asynchronous Loading JavaScript: This is recommended so JavaScript downloads do not block HTML page rendering, since JavaScript (and CSS) are render-blocking resources by default. For example, the browser can download and render a large hero image sooner for a faster First Contentful Paint (FCP) user experience. The result is faster web page load time in your browser. This is accomplished with either the 'async' or 'defer' attribute. With async, the script downloads in parallel with HTML parsing and executes as soon as it arrives, which may briefly interrupt parsing. With defer, the script also downloads in parallel but execution waits until HTML parsing completes.
<script async src="https://www.google-analytics.com/analytics.js"></script>
<script defer src="https://www.google-analytics.com/analytics.js"></script>
Code Splitting: The advantage of splitting your code into smaller chunks is a faster initial download to the browser. Remaining chunks are downloaded on demand based on user behavior and requests. Consider how code splitting will affect testing complexity and weigh the advantages.
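As a minimal sketch, modern bundlers and browsers support dynamic import() for on-demand loading; the './chart.js' module and element ids below are hypothetical.

// The chart code is excluded from the initial bundle and downloaded only on first use.
document.getElementById('show-chart').addEventListener('click', async () => {
  const { renderChart } = await import('./chart.js'); // bundlers emit this as a separate chunk
  renderChart(document.getElementById('chart-container'));
});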
HTTP/3
Approximately 35% of websites currently support HTTP/3, and that number is growing. It is the successor to HTTP/2 and is quickly replacing both HTTP/2 and HTTP/1.1. It virtually eliminates head-of-line blocking with stream multiplexing and uses an updated header compression method called QPACK. HTTP/3 is a paradigm shift from traditional TCP-based connections to QUIC over UDP, with all the advantages that brings. The problem with TCP is that it cannot easily be modified, since it is implemented throughout operating systems and network devices that change slowly.
There is protocol overhead from TCP session handshakes and flow control that is not required with UDP transport. Multiple files can be downloaded simultaneously over a single QUIC connection, and segments can arrive out of order without one lost packet stalling the other streams. In addition, connection migration makes switching between wired or Wi-Fi and mobile service seamless.
Older techniques such as concatenation and spriting are no longer required or recommended with HTTP/3. In fact, enabling HTTP/3 will allow simultaneous download of multiple image files without the protocol overhead of TCP transport.
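If you want to confirm which protocol actually delivered a page, the Navigation Timing API exposes it in supporting browsers; this is a quick diagnostic sketch, not a feature test.

<script>
  // nextHopProtocol reports 'h3' for HTTP/3, 'h2' for HTTP/2, or 'http/1.1'.
  const [nav] = performance.getEntriesByType('navigation');
  console.log('Protocol used for this page:', nav.nextHopProtocol);
</script>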
HTTP and Service Worker Caching
All web developers should know how client-side and server-side caching strategies work, along with their effects on performance and security. There is a balance to strike when deciding what content can be cached and for how long before a refresh is required. This extends to CDN caching of static and dynamic content as well. Content that is cached locally also reduces web server utilization, which improves resiliency and web page load time. Most server-side caching is based on reverse proxy servers and in-memory caching solutions such as Redis and Memcached. Service worker caching has some advantages over HTTP caching, such as granular, programmatic control over what content is cached and when it is refreshed.
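A minimal cache-first service worker sketch is shown below; the cache name and asset list are illustrative assumptions. The page would register it once with navigator.serviceWorker.register('/sw.js').

// sw.js -- cache-first strategy using the Cache API.
const CACHE = 'static-v1';
const ASSETS = ['/', '/styles.css', '/app.js'];

self.addEventListener('install', (event) => {
  // Precache static assets when the service worker installs.
  event.waitUntil(caches.open(CACHE).then((cache) => cache.addAll(ASSETS)));
});

self.addEventListener('fetch', (event) => {
  // Serve cached responses first; fall back to the network on a miss.
  event.respondWith(
    caches.match(event.request).then((cached) => cached || fetch(event.request))
  );
});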
Reverse Proxy Server
The primary reasons for implementing a reverse proxy server are its performance and security advantages. It is deployed in front of web servers to enable server-side caching. It can also offload SSL/TLS encryption/decryption and data compression from the web servers. The result is reduced frontend and backend server utilization, which benefits multiple web applications at once.
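To make the idea concrete, here is a minimal sketch of a caching reverse proxy written with Node's standard http module; the upstream address, port numbers, and 60-second TTL are assumptions for illustration, and a production deployment would normally use a dedicated proxy such as NGINX or Varnish.

const http = require('http');

const UPSTREAM = { host: '127.0.0.1', port: 3000 }; // backend web server (assumed)
const cache = new Map(); // url -> { status, headers, body, expires }

http.createServer((req, res) => {
  // Serve cached GET responses directly without touching the web server.
  const hit = req.method === 'GET' && cache.get(req.url);
  if (hit && hit.expires > Date.now()) {
    res.writeHead(hit.status, hit.headers);
    return res.end(hit.body);
  }
  // Otherwise forward the request upstream and cache successful GET responses.
  const upstreamReq = http.request(
    { ...UPSTREAM, path: req.url, method: req.method, headers: req.headers },
    (upstreamRes) => {
      const chunks = [];
      upstreamRes.on('data', (chunk) => chunks.push(chunk));
      upstreamRes.on('end', () => {
        const body = Buffer.concat(chunks);
        if (req.method === 'GET' && upstreamRes.statusCode === 200) {
          cache.set(req.url, {
            status: 200,
            headers: upstreamRes.headers,
            body,
            expires: Date.now() + 60000, // 60-second TTL
          });
        }
        res.writeHead(upstreamRes.statusCode, upstreamRes.headers);
        res.end(body);
      });
    }
  );
  req.pipe(upstreamReq);
}).listen(8080);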
HTML Optimization
The purpose of client-side HTML optimization is to reduce web page load time and improve perceived user experience. The most recommended techniques include lazy loading, prefetch, preconnect, and preload (see the markup sketch after this list).
- Lazy Loading: demand-oriented page load strategy that renders above-the-fold content first and defers the rest
- Link Prefetch: background content fetch during idle time after page rendering
- Preconnect: the browser preemptively opens a connection to a different origin server
- Object Preload: declared objects are downloaded and cached before page rendering needs them
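Here is the markup sketch referenced above; the file names and third-party origins are placeholders.

<!-- Lazy load an image that sits below the fold -->
<img src="product-photo.jpg" loading="lazy" alt="Product photo">

<!-- Prefetch a page the user is likely to visit next, during idle time -->
<link rel="prefetch" href="/next-page.html">

<!-- Open a connection to a third-party origin before it is needed -->
<link rel="preconnect" href="https://fonts.gstatic.com" crossorigin>

<!-- Download and cache a render-critical resource before rendering needs it -->
<link rel="preload" href="/fonts/main.woff2" as="font" type="font/woff2" crossorigin>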

Webpack
The size of a single web page has steadily increased to an average of roughly 2.2 MB. Most of that bloat can be attributed to the number and size of image files. Larger image files mean longer downloads, which get worse as network congestion and latency increase. Performance also takes a hit when a high number of files each require their own connection. Webpack is recommended to optimize how your files are loaded; this applies to JavaScript, image, and CSS files. It is really a build-time tool that bundles files and reduces file size for faster, more efficient page loads.
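A minimal webpack configuration sketch is shown below; the entry point and output path are assumptions, and real projects usually add loaders for CSS and images.

// webpack.config.js
const path = require('path');

module.exports = {
  mode: 'production',                       // enables minification
  entry: './src/index.js',                  // application entry point (assumed)
  output: {
    filename: '[name].[contenthash].js',    // cache-friendly file names
    path: path.resolve(__dirname, 'dist'),
    clean: true,                            // empty the output folder between builds
  },
  optimization: {
    splitChunks: { chunks: 'all' },         // split shared/vendor code into separate bundles
  },
};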
Packet Size (MTU)
Most application vendors recommend that you do not lower the global default packet size (MTU) of 1500 bytes. Doing so lowers data throughput and immediately increases the number of turns required per transaction. Raising it above 1500 bytes instead risks packet fragmentation, since most network interfaces won't forward larger packets; that applies both within a data center and across the internet. Ethernet jumbo frames can, however, be implemented between all switches within the data center. This increases throughput significantly for your full-stack applications and works well for backup traffic.
Transport Layer Security (TLS) 1.3
HTTPS accounts for approximately 90% of all internet traffic since it provides a secure, encrypted connection. This is enabled with Transport Layer Security (TLS), the successor to the SSL protocol. TLS 1.3 is the most current version and is well recommended for its faster handshake and 0-RTT session resumption. This is an advantage when multiple connections to web servers exist and new TCP connections are opened, since each handshake adds latency and web page load time. There is also the OCSP stapling (Online Certificate Status Protocol) feature, which caches certificate revocation status at the origin server. This is a performance advantage since clients do not have to query the CA's OCSP responder and incur more network latency on each connection.
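A quick way to verify which TLS version your origin negotiates is sketched below using Node's tls module; the hostname is a placeholder.

const tls = require('tls');

// Connect and report the negotiated protocol version, e.g. 'TLSv1.3'.
const socket = tls.connect({ host: 'www.example.com', port: 443, servername: 'www.example.com' }, () => {
  console.log('Negotiated protocol:', socket.getProtocol());
  socket.end();
});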

Public DNS Server
Every web developer should know how DNS works since it is fundamental to web applications. Every connection to your application starts by resolving an IP address from a URL hostname. DNS adds its own network latency and server processing delay to web page load time. This is particularly relevant to internet applications where performance can vary with congestion. Cloudflare (1.1.1.1) and Google (8.8.8.8) operate among the fastest and most reliable public DNS servers available. It is easy to point your clients at a public DNS provider such as Cloudflare and reduce page load time significantly.
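As a rough sketch, you can measure resolver latency against a public DNS server with Node's dns module; the hostname is a placeholder and results vary with caching and network conditions.

const { Resolver } = require('dns').promises;

// Resolve a hostname through Cloudflare's public resolver and report the lookup time.
async function timeLookup(hostname) {
  const resolver = new Resolver();
  resolver.setServers(['1.1.1.1']); // Cloudflare public DNS
  const start = Date.now();
  const addresses = await resolver.resolve4(hostname);
  console.log(`${hostname} -> ${addresses[0]} in ${Date.now() - start} ms`);
}

timeLookup('www.example.com');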
