How We Test & Review Proxy Services

It’s no secret that Proxyway is an affiliate website. This is how we (Adam, Chris, and Isabel) sustain ourselves. At the same time, we run detailed performance benchmarks and strive to provide impartial evaluations of the services we review.

Can these seemingly conflicting interests be reconciled? And why should you trust us? This article thoroughly explains our testing methodology and how Proxyway balances the commercial and research sides of the project.

How Proxyway Makes Money

Proxyway has several sources of income:

  • Our primary source is affiliate relationships with proxy server providers and other companies we cover on this website. The details vary by provider, but we normally receive a revenue share for each paying customer who registers via an affiliate link. These links appear in our Best Provider lists, as well as in individual reviews.

    This doesn’t mean that you pay more than you would by visiting the company directly – the price stays the same or even drops, as we often manage to negotiate custom discounts and other deals.

  • Unlike many other affiliate websites, we don’t let companies bid their way into positions on our lists. Instead, we build the rankings from performance benchmarks and our hands-on experience with each service. We then invite providers to claim the position the rankings assign them; if a provider declines, we offer the spot to the next best one.

  • We also make money by selling reviews to interested companies. The fee covers the labor involved in producing the review, and paying does not guarantee a favorable evaluation. We describe the process in detail in the following section.

  • We have other sources of income, too, but they constitute a minor share of our earnings. For example, you may sometimes see a banner promoting an event or service. In addition, providers may influence their order of appearance in the Reviews navigation drop-down and on the Reviews category page.


Our research initiatives are removed from the commercial side of the project. In other words, you can’t buy your way to better results or a more favorable description.

Our Review Process

Proxyway is a well-known outlet in its niche, so we get many review requests. Due to our workload and other activities, we can respond only to some of them. There aren’t any strict selection criteria, but we generally prefer services that have a large market presence, offer something unique (such as their own proxy pool), or simply interest us.

Our reviews cost money. The fee compensates for the labor invested in running performance tests and creating content. Paying it earns providers a spot on Proxyway, but it doesn’t guarantee a favorable evaluation. Specifically, reviewees receive:

  • Detailed performance evaluation of one or several products. We cover the benchmarks below.
  • Appearance in our list of reviews. 
  • Appearance in our Best Provider lists based on the results.
  • A spot in our research initiatives (such as the annual Proxy Market Research), though having a review isn’t a hard requirement to participate.
  • Continuous coverage in our Proxy Market News section.


Providers often give us access to the services we test. This carries a risk of cheating (such as a provider supplementing its proxy network with another company’s IPs), but it also allows providers to show their full potential. In short, we work on the basis of a gentleman’s agreement.

We show the draft to reviewees before publishing to verify factual information. However, we don’t accept editorial edits, and unless we’ve made a serious methodological error, we rarely re-test a product during the review process. As a result, we publish few glowing reviews, but companies are generally satisfied with what they tend to call a balanced evaluation. In rare cases, reviewees have opted to remain unpublished.

Benchmarking Methodology

Detailed performance benchmarks are one of Proxyway’s hallmarks. This section explains how we run them. 

Proxy Pool Test

Our proxy pool test evaluates the size and composition of proxy networks. We run it on every product that takes the form of a rotating proxy pool: most often residential and mobile proxy networks, but also datacenter and ISP proxies if they meet this criterion.

The benchmark uses our custom-made Python script that is publicly available on GitHub. We run up to 60,000 requests per day, depending on the proxy network’s location and type, rotating IPs with every connection request. Our longest benchmark, which tests a proxy pool without any filters, lasts 21 days. 
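
To illustrate the idea, here’s a minimal sketch of such a rotation loop. It’s not our production script: the gateway address, credentials, and IP-echo endpoint are placeholders, and the request volume is scaled down.

```python
# Minimal sketch of a proxy pool test (illustrative, not the production script).
# Assumes httpx >= 0.26 (the proxy= argument); the gateway and IP-echo URLs are placeholders.
import httpx

PROXY = "http://user:pass@gateway.example.com:8000"  # hypothetical rotating gateway
IP_ECHO = "https://ip.example.com"                   # hypothetical endpoint that returns the caller's IP
REQUESTS_TO_SEND = 1_000                             # scaled down from the real volume

seen_ips: set[str] = set()
failures = 0

for _ in range(REQUESTS_TO_SEND):
    try:
        # A fresh client per request avoids connection reuse; rotating gateways
        # typically assign a new exit IP for every request.
        with httpx.Client(proxy=PROXY, timeout=10) as client:
            exit_ip = client.get(IP_ECHO).text.strip()
            seen_ips.add(exit_ip)
    except httpx.HTTPError:
        failures += 1

successful = REQUESTS_TO_SEND - failures
print(f"unique IPs: {len(seen_ips)} out of {successful} successful requests")
```

The actual benchmark runs a loop like this for days at a time and aggregates the unique-IP counts per location filter, which is where the totals in the tables further below come from.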

Some may argue that more requests should be made, but: 

  • Proxy providers usually prioritize IP uniqueness as a selection criterion.
  • Research papers and our experience show that after a certain point, proxies start repeating themselves.
  • Our data spans multiple years, and a consistent number enables easier comparison.


We get details about the IP address associated with a proxy server from two leading databases: IP2Location and MaxMind. 

  • MaxMind gives us geolocation and ISP-related data, such as the country and city the IP is in, or which internet service provider it belongs to. 
  • IP2Location tells us the usage type of the IP: in other words, what type of service the company that owns the IP address offers. That could be a landline or mobile internet provider, a data center, a hosting service, and so on. It helps us decide whether the proxy belongs to a residential network (ISP, ISP/MOB, or MOB usage type) or to a mobile network (MOB or ISP/MOB usage type) – the rule is sketched below.
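
The decision rule boils down to a small mapping. The usage-type codes below are IP2Location’s; the function itself is just an illustration:

```python
# Sketch of the classification rule described above (illustrative).
RESIDENTIAL_TYPES = {"ISP", "ISP/MOB", "MOB"}
MOBILE_TYPES = {"MOB", "ISP/MOB"}

def classify(usage_type: str) -> dict[str, bool]:
    """Decide whether an IP can count toward a residential or a mobile pool."""
    return {
        "residential": usage_type in RESIDENTIAL_TYPES,
        "mobile": usage_type in MOBILE_TYPES,
    }

print(classify("ISP/MOB"))  # {'residential': True, 'mobile': True}
print(classify("DCH"))      # datacenter/hosting code: {'residential': False, 'mobile': False}
```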


Below are our testing parameters for different proxy networks. They may change over time:

Proxy location             Days   Total requests
Global                     21     1.2 million
EU (DE, ES, FR, IT, NL)    14     1.2 million
US                         14     560,000
UK                         14     560,000
India                      14     560,000
Brazil                     14     560,000
Australia                  7      140,000

Proxy location             Days   Total requests
Global                     14     280,000
EU (DE, ES, FR, IT, NL)    14     280,000
US                         14     280,000
UK                         14     280,000
India                      14     280,000
Brazil                     14     280,000
Australia                  7      140,000

Proxy location             Days   Total requests
US                         7      70,000

Infrastructure Performance Test

Our infrastructure performance test runs together with the proxy pool evaluation (they’re actually the same benchmark). However, it considers different parameters, in particular:

  1. Success rate – how many requests successfully reach the target website and return a response.
  2. Response time – how long this process takes.
  3. Proxy network’s stability – because we ping the proxy network nearly every second, we can track its downtime periods.


The test aims to objectively measure a proxy network’s speed and availability, so it needs a target that a) is always available and b) won’t rate-limit or block our requests. Both criteria are met by Cloudflare’s Trace tool, which automatically connects to the nearest Cloudflare data center. The page weighs around six kilobytes.
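
As a rough sketch of how these metrics fall out of the run (scaled down; the gateway address is a placeholder, while the trace URL is Cloudflare’s public endpoint):

```python
# Sketch of the infrastructure measurement loop (illustrative).
import time
import httpx

PROXY = "http://user:pass@gateway.example.com:8000"     # hypothetical rotating gateway
TRACE_URL = "https://www.cloudflare.com/cdn-cgi/trace"  # Cloudflare's public trace endpoint

results: list[tuple[float, float | None]] = []  # (timestamp, response time in seconds or None on failure)

for _ in range(100):
    started = time.monotonic()
    try:
        with httpx.Client(proxy=PROXY, timeout=10) as client:
            client.get(TRACE_URL).raise_for_status()
        results.append((started, time.monotonic() - started))
    except httpx.HTTPError:
        results.append((started, None))  # failed requests count against the success rate
    time.sleep(1)                        # roughly one request per second

ok = [rt for _, rt in results if rt is not None]
print(f"success rate: {len(ok) / len(results):.1%}")
if ok:
    print(f"average response time: {sum(ok) / len(ok):.2f} s")
# Runs of consecutive failures (None values) mark the network's downtime windows.
```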

Rotating proxy networks usually have load balancers in various geographical locations. To more accurately capture their response time, we send requests from three testing servers located in Germany, the US, and Singapore, depending on the proxy country. Our requests to the unfiltered pool are made from Germany.

Proxy location             Testing server’s location
Global                     Germany
EU (DE, ES, FR, IT, NL)    Germany
US                         US
UK                         Germany
India                      Singapore
Brazil                     US
Australia                  Singapore

Performance with Popular Targets

We also test how well the proxy servers work with popular websites. Particular targets vary by proxy type, but our list usually includes a search engine (Google), an e-commerce website (Amazon), and a popular image-based social media network.

Here, we use a different Python script based on HTTPX. It’s configured with custom headers, such as user agents, to prevent blocks. Our scraper for the social media network is based on Puppeteer with a stealth plugin.

We make between 2,000 and 4,000 connection requests to each target, rotating IPs with every request. The proxy server location is set to the US.
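
A trimmed-down version of such a request might look like this. The gateway address, the country-targeting convention in the username, and the header values are placeholders; the real script sets more headers and handles errors in greater detail:

```python
# Sketch of a request to a popular target through a US exit node (illustrative).
import httpx

PROXY = "http://user-country-us:pass@gateway.example.com:8000"  # hypothetical US-targeting gateway
HEADERS = {
    # A realistic user agent and language header reduce the chance of an instant block.
    "User-Agent": "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 "
                  "(KHTML, like Gecko) Chrome/124.0 Safari/537.36",
    "Accept-Language": "en-US,en;q=0.9",
}

def fetch(url: str) -> bool:
    """Return True if the target responds with a non-blocked page."""
    try:
        with httpx.Client(proxy=PROXY, headers=HEADERS, timeout=15, follow_redirects=True) as client:
            response = client.get(url)
        # Treat 2xx as success; 403, 429, and similar status codes count as blocks.
        return response.is_success
    except httpx.HTTPError:
        return False

print(fetch("https://www.amazon.com/dp/B000000000"))  # hypothetical product URL
```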

This is probably our least objective benchmark, as it strongly depends on the web scraper setup. However, it can still be useful for capturing outliers and, especially, for comparing several providers.

Download Speed Test

Datacenter and ISP proxy servers often face bigger bandwidth demands per IP address than residential or mobile proxy networks. We evaluate their throughput by selecting at least 10 random proxies and downloading a large file through each.

The targets we use are Hetzner’s 100 MB and 1 GB Ashburn speed tests. We record the lowest, highest, and average download speeds and then compare them to the baseline speed without a proxy server.
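
Here’s a sketch of a single throughput measurement. The proxy address is a placeholder, and the speed-test URL is assumed to follow Hetzner’s public Ashburn naming (verify it before use):

```python
# Sketch of one download-speed measurement (illustrative).
import time
import httpx

PROXY = "http://user:pass@proxy.example.com:8000"      # hypothetical datacenter/ISP proxy
TEST_FILE = "https://ash-speed.hetzner.com/100MB.bin"  # assumed 100 MB Ashburn test file

def download_speed(proxy: str | None) -> float:
    """Download the test file (directly or via a proxy) and return throughput in MB/s."""
    started = time.monotonic()
    downloaded = 0
    with httpx.Client(proxy=proxy, timeout=120) as client:
        with client.stream("GET", TEST_FILE) as response:
            for chunk in response.iter_bytes():
                downloaded += len(chunk)
    return downloaded / (1024 * 1024) / (time.monotonic() - started)

baseline = download_speed(None)    # direct connection, no proxy
via_proxy = download_speed(PROXY)
print(f"baseline: {baseline:.1f} MB/s, via proxy: {via_proxy:.1f} MB/s")
```

In the full test, this measurement is repeated for each of the sampled proxies, and the lowest, highest, and average results are compared against the baseline.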

Other Benchmarks

We may and sometimes do include additional benchmarks. They appear under this section because we haven’t standardized them yet. 

For example, 2024’s Proxy Market Research included what we call the IP Quality test. There, we used a third-party service called IPQualityScore to gauge the level of abuse residential proxy networks have faced.

More benchmarks like this may appear over time. Once they become established, we’ll cover them in a separate section.

Questions & Feedback

If you have any questions or suggestions for improving our processes, feel free to reach out via info at proxyway dot com.