A content delivery network (CDN) is a geographically distributed network of servers that delivers web content from data centers around the globe. CDNs are commonly used to speed up websites and reduce bandwidth costs.
Beyond simple delivery, CDNs provide services such as caching, load balancing, and high availability. They are designed to deliver content at high speed and low cost.
The servers in a CDN store copies of frequently accessed files, such as images, video, or JavaScript code, and serve them to nearby visitors. The goal is to improve performance and reduce latency.
CDNs are usually operated by third parties, which means the operator can see information about your visitors' requests. That access is worth weighing: in principle, a CDN operator could use traffic data to track user activity and build profiles of individual users' interests.
The basic request flow works like this:
A CDN edge server receives requests for web objects (e.g., HTML documents or images). When a request comes in, the edge server checks whether a local copy is available. If so, it returns the object directly to the client; otherwise, it forwards the request to the origin server (or a parent CDN tier), returns the response to the client, and caches a copy for future requests.
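The cache-or-fetch flow described above can be sketched in a few lines. This is a minimal illustration only; real CDNs add cache keys, TTLs, validation, and tiered caches, and the `origin_fetch` function here is just a stand-in for a request to the origin server.

```python
def origin_fetch(path):
    # Placeholder for a real request to the origin server.
    return f"<contents of {path}>"

edge_cache = {}  # path -> cached object body

def handle_request(path):
    """Serve from the local cache if possible; on a miss,
    fetch from the origin and cache the result."""
    if path in edge_cache:
        return edge_cache[path], "HIT"
    body = origin_fetch(path)
    edge_cache[path] = body
    return body, "MISS"

body, status = handle_request("/logo.png")    # first request: MISS
body, status = handle_request("/logo.png")    # second request: HIT
```

The first request for any object pays the full round trip to the origin; every later request for the same object is answered locally.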
In addition to providing faster delivery of content, CDNs provide other benefits including:
Reducing the number of HTTP requests that reach the origin server
Reducing the amount of time spent waiting for content to download
Minimizing the time between when a visitor clicks a link and when the page with the linked content loads
Reducing Bandwidth Costs
A CDN reduces origin bandwidth usage because most of the traffic is served from nearby edge servers. In this way, CDNs allow companies to pay for less origin bandwidth without sacrificing performance.
Another benefit of using a CDN is pricing: a CDN's per-gigabyte delivery rates are often lower than what a hosting provider charges for origin bandwidth, so offloading most traffic to the CDN can lower a site owner's overall bill.
CDNs are not free, however. Companies pay for the delivery traffic and any supporting services. Depending on the size of the company, these costs could range anywhere from $1,000 per month to hundreds of thousands per month.
Who is using CDN?
As of 2015, Amazon Web Services (AWS) was the largest provider of cloud services. AWS has grown rapidly since its launch in 2006, and it offers its own CDN, Amazon CloudFront.
Other large cloud providers with CDN offerings include Google Cloud Platform, Microsoft Azure, IBM SoftLayer, Rackspace Hosting, and Alibaba Cloud. There are also dedicated CDN providers, large and small; Akamai Technologies, one of the largest, offers both private and public CDN services.
Why Use CDN?
There are several reasons why you might want to use a CDN. Here are some common ones:
Speed Up Websites
When a browser downloads a resource, it sends a request to the server hosting the resource, and the server responds with a file containing what was requested. Along the way, the request passes through a series of routers and other network devices, and each hop adds a little delay.
When a CDN is used, the request goes to the closest CDN edge server instead of the distant origin. The shorter path means fewer hops and a much smaller round-trip time, so the response arrives more quickly than one that has to travel all the way to the origin.
This speed-up can make a big difference in page load times. According to a study conducted by Pingdom, page load times improved by up to 20% after switching to a CDN.
Save Money
Because CDNs reduce the distance traveled by data, they may also save money. Data traveling via a long route may cost more than data traveling via a shorter route.
For example, suppose a company serves all of its traffic directly from its origin servers, and its monthly bandwidth bill is $10,000 (a hypothetical figure).
Now suppose a CDN takes over most of that traffic. The origin only needs to serve cache misses, cutting its bandwidth bill to $2,500, while the CDN charges $5,000 per month for the delivery it handles.
The total cost is now $2,500 + $5,000 = $7,500 per month. That's a savings of $2,500!
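As a toy calculation with hypothetical figures (none of these are real prices), suppose serving everything from the origin costs $10,000 a month, while with a CDN the origin bill drops to $2,500 and the CDN charges $5,000:

```python
# Hypothetical monthly costs, for illustration only (not real prices).
direct_bandwidth_cost = 10_000   # serving everything from the origin

origin_cost_with_cdn = 2_500     # origin now serves only cache misses
cdn_fee = 5_000                  # what the CDN provider charges

total_with_cdn = origin_cost_with_cdn + cdn_fee
savings = direct_bandwidth_cost - total_with_cdn
print(total_with_cdn, savings)   # 7500 2500
```

Whether the numbers work out this favorably depends entirely on the CDN's pricing and how cacheable the site's traffic is.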
Reduce Bandwidth Usage
CDNs help to reduce origin bandwidth usage because they serve resources from nearby edge servers instead of the origin. Without a CDN, every request for a web page travels to the server hosting it, and that server returns a file containing the requested content.
Let's assume the page weighs 1 MB. The visitor downloads that 1 MB either way, but without a CDN the origin server has to send it out again for every single visitor.
Now imagine that the same website is hosted behind a CDN. The page is served from a cached copy on a nearby edge server, so most requests never reach the origin at all; only cache misses do.
The result is that the user receives the page faster and the origin sends far fewer bytes, which is what actually cuts the bandwidth bill.
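The origin-side effect can be estimated with a cache hit ratio, the fraction of requests the edge servers answer without contacting the origin. The figures below are illustrative assumptions, not measurements:

```python
# Illustrative estimate of origin bandwidth with and without a CDN.
monthly_traffic_gb = 1_000   # total content delivered to users (assumed)
cache_hit_ratio = 0.9        # fraction of requests served from edge caches (assumed)

origin_gb_without_cdn = monthly_traffic_gb
origin_gb_with_cdn = monthly_traffic_gb * (1 - cache_hit_ratio)

print(round(origin_gb_with_cdn))  # about 100 GB now leaves the origin
```

With a 90% hit ratio, only a tenth of the traffic touches the origin; users still download the same total amount, but from edges instead.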
Increase Page Load Speed
Another benefit of using a CDN is that it helps to increase page load speed. A CDN caches files so that users don’t have to wait for them to arrive from their origin server.
The end result? Users get their pages faster. They spend less time waiting for their pages to load.
Improve SEO
A CDN can help search engine optimization (SEO) indirectly: search engines factor page speed into their rankings, and a CDN makes pages load faster.
Faster pages also tend to keep visitors around longer and reduce bounce rates, which further supports search performance.
Google has confirmed that site speed is one of the signals used in its ranking algorithms.
So why not start using a CDN today? It will pay off in the future.
I’ve been looking into setting up a CDN for our company. We have a lot of static content but we also host videos on YouTube. Does anyone know how well these work together? Do you think it would be worth it to set up a CDN for static content and then proxy through another CDN for video?
You could do both, yes. There isn't anything inherently wrong with doing that. One thing to watch, though: if you change the URL or contents of a resource, the old copies in the CDN cache don't disappear instantly. You either wait for them to expire, issue a purge through the CDN, or update every reference to point at the new URL, and that can take some time.
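A common way around that problem is cache busting: embed a hash of the file's contents in its URL, so a changed file automatically gets a new URL and stale cached copies are simply never requested again. A minimal sketch (the naming scheme here is just one illustration, not a standard):

```python
import hashlib

def versioned_name(filename, content):
    """Build a cache-busting filename like 'app.3a7bd3e1.js' from
    the file's contents, so any change yields a new URL."""
    digest = hashlib.sha256(content).hexdigest()[:8]
    stem, dot, ext = filename.rpartition(".")
    return f"{stem}.{digest}.{ext}" if dot else f"{filename}.{digest}"

print(versioned_name("app.js", b"console.log('v1');"))
```

Because the URL changes with the content, the versioned files can be cached forever; only the HTML that references them needs a short cache lifetime.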
What about performance issues? Would you see a difference between serving a video from an internal network versus a CDN?
There is no magic bullet here. Every situation is different. What works best depends on many factors.
For example, there are two related classes of delivery networks: content delivery networks (CDNs) and application delivery networks (ADNs). An ADN provides services such as application acceleration, security, and high availability, and might include proxies, firewalls, load balancers, and so on. A plain CDN focuses on serving files.
If you’re hosting videos on YouTube, you’ll probably want to look at an ADN. If you’re just storing images or text, a CDN will likely suffice.
Now, let’s talk about performance. There are three things to consider:
• The distance between your servers and your clients
• Bandwidth
• Latency
Distance between your servers and your client devices greatly affects performance. Let's say you're running a web site on Amazon EC2 instances located in North Virginia while your clients are in San Francisco. To serve a request, your server needs to send data across the continent, and that distance shows up directly as added delay on every round trip.
Latency is the amount of time it takes for a response to arrive after a request is made, and it is measured in milliseconds (ms). Latency is affected by distance and, for large transfers, by bandwidth.
Bandwidth refers to the capacity of your connection. A dial-up modem tops out around 56 Kbps; a typical cable connection might offer 50 Mbps down and 20 Mbps up.
The more bandwidth you have available, the faster your connection can transmit data. If you have very little bandwidth, you won't be able to download large files quickly. For example, suppose you need to transfer 10 MB, which is 80 megabits. Over an 8 Mbps link, that takes about 10 seconds; over a 100 Mbps link, about 0.8 seconds.
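The arithmetic is just size divided by rate, with the megabyte-to-megabit conversion being the step people most often trip on (this ignores protocol overhead and latency, so real transfers are somewhat slower):

```python
# Idealized transfer time = data size / bandwidth,
# ignoring protocol overhead, latency, and congestion.
def transfer_seconds(megabytes, mbps):
    megabits = megabytes * 8   # 1 byte = 8 bits
    return megabits / mbps

print(transfer_seconds(10, 8))    # 10.0 s on an 8 Mbps link
print(transfer_seconds(10, 100))  # 0.8 s on a 100 Mbps link
```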
The third factor affecting performance is latency. As mentioned earlier, latency is the amount of time required for a response to reach a client device, and it grows with distance: a signal simply takes longer to cover more ground. A request that has to cross the Pacific Ocean will take noticeably longer than one traveling between, say, New York City and a nearby data center.
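A rough lower bound on that delay comes from physics: light in optical fiber travels at roughly two-thirds the speed of light, about 200,000 km/s. Real round trips are higher because of routing and queuing, and the distances below are approximate assumed figures:

```python
# Rough one-way propagation delay over fiber, ignoring routing hops.
FIBER_KM_PER_SEC = 200_000  # light in fiber travels at roughly 2/3 c

def one_way_ms(distance_km):
    return distance_km / FIBER_KM_PER_SEC * 1000

# Approximate great-circle distances (assumed figures):
print(one_way_ms(4_100))   # New York -> San Francisco: ~20 ms one way
print(one_way_ms(8_300))   # San Francisco -> Tokyo: ~41 ms one way
```

Since a request-plus-response needs at least one full round trip, doubling these numbers gives the floor on response time, which is exactly the cost a nearby CDN edge avoids.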
So how does all this affect us? Well, let's say we have a website that hosts videos, and our own CDN stores copies of the videos in various locations around the world. Now suppose a viewer in New York City wants to watch a specific video. Their request goes to the nearest CDN edge; if that edge doesn't yet have a copy, it forwards the request to our origin server in North Virginia.
The origin responds with a 200 OK status code and the requested media file. On that first, uncached request, the distance between the viewer and the origin means the response takes longer to travel, so the viewer has to wait longer before the video starts.
Of course, there are other factors that could delay the delivery of a response. For instance, if we had multiple requests coming in simultaneously, our server may not be able to respond to every single request immediately. And even if we did, some requests might still fail due to network problems.
In order to ensure that our viewers always receive a timely response, we need to keep a copy of the content as close to them as possible, ideally cached within their local region. That way, when they make a request, the response comes back much quicker.
Conclusion
Latency is an important part of designing a high-performance website or web application. It affects many different aspects of a site. From user experience to SEO to scalability, latency plays a role in almost everything that happens on the World Wide Web.