How to Use a Proxy With Axios and Node.js

Vilius Dumcius
Axios is one of the most commonly used Node.js libraries for web scraping. It enables developers to download the contents of websites, which can later be parsed with a library like Cheerio.
By default, Axios exposes your IP address when you connect to a website. This can lead to repercussions such as an IP ban. To avoid that, web scrapers use proxies—servers that act as middlemen between the client and the server. They help you hide your web scraping activities and protect your IP address.
This article will show you how to use proxies together with Axios when scraping with Node.js.
How to Use Proxies in Node.js
This tutorial will show you how to use proxies in Node.js via Axios. If you prefer a different HTTP client, check out our tutorial on how to use a node-fetch proxy.
Setup
First, ensure that Node.js is installed on your device. If it isn't, you can use the official instructions to download and install it.
Then, create a new folder called axios_proxy, navigate into it, and run the npm init command to create a new Node.js project. After that, install Axios and Cheerio with the following terminal command. Axios will be used for making web requests, while Cheerio will be used for the web scraping example.
npm i axios cheerio
Finally, create a file called index.js and open it in your favorite code editor.
Basic HTTP Request With Axios
Here's how a basic HTTP request made with Axios looks.
const axios = require('axios');

axios.get('https://quotes.toscrape.com/')
  .then((r) => {
    console.log(r.data);
  });
All the function above does is request the content of a web page (in this case, it's Quotes to Scrape) and print it out.
If you use a library like Cheerio, you can parse the response to extract the necessary information.
const axios = require('axios');
const cheerio = require('cheerio');

axios.get('https://quotes.toscrape.com/')
  .then((r) => {
    const $ = cheerio.load(r.data);
    const quote_blocks = $('.quote');
    const quotes = quote_blocks.map((_, quote_block) => {
      const text = $(quote_block).find('.text').text();
      const author = $(quote_block).find('.author').text();
      return { text: text, author: author };
    }).toArray();
    console.log(quotes);
  });
In the case of Quotes to Scrape, the website is made for web scraping. But a website like Amazon will make you fill out CAPTCHAs or log in if it detects that many requests have been made from the same address. For this reason, it's useful to add a proxy to your Axios requests.
Using a Proxy with Axios
To use a proxy server with Axios, you need to create a new variable that holds the value for the protocol, IP address, and port of the proxy that you want to connect to.
Here's an example:
const proxy = {
  protocol: 'http',
  host: '176.193.113.206',
  port: 8989
};
After that, you can use the proxy as an additional argument in the axios.get() request.
axios.get('https://quotes.toscrape.com/', {proxy: proxy})
Requests will now be funneled through the proxy that you provided.
Using SOCKS5 Proxy with Axios
Axios doesn’t support SOCKS proxies directly, but you can use the socks-proxy-agent library to handle the setup (https-proxy-agent serves the same purpose for HTTP and HTTPS proxies). It comes in handy whenever your proxy config involves a SOCKS proxy server or SOCKS proxy credentials.
First, install the required package:
npm install socks-proxy-agent
Then use it like this:
const axios = require('axios');
const { SocksProxyAgent } = require('socks-proxy-agent');

const agent = new SocksProxyAgent('socks5://127.0.0.1:9050');

axios.get('https://quotes.toscrape.com/', {
  httpAgent: agent,
  httpsAgent: agent
}).then((res) => {
  console.log(res.data);
}).catch((error) => {
  console.error('Error occurred:', error.message);
  console.error('Error code:', error.code);
  if (error.response) {
    console.error('Response status:', error.response.status);
  }
});
This setup routes the request through the SOCKS5 proxy agent, which is handy when you rely on a SOCKS proxy for privacy or for bypassing blocks.
Note that the code block uses a placeholder address. Replace it with the address and port of your real SOCKS5 proxy.
How to Find a Proxy?
Of course, there's a catch. If you're new to web scraping, you probably don't have access to a proxy.
There are two ways to find a proxy to connect to. You can either scour the internet for lists of free proxy servers or pay for access to a professional proxy server.
In the first case, the proxy servers you find are likely to be slow, unsafe, and unreliable, and each one provides you with only a single IP address. These issues can be addressed, but it takes quite a bit of time and expertise.
In the second case, the proxy service provider will provide you with a secure endpoint to connect to. It will also provide proxy rotation by default. This means that it can change your IP address on every request, masking both your IP address and the fact that any scraping is being done on the page at all!
In addition, since there is plenty of competition between proxy services, good, reliable service is available for a rather small price: around $7 per GB.
You will learn how to work with both types of proxies. So if you want to strike out on your own, that's all right.
However, if you are seeking a reliable proxy provider that offers hard-to-detect proxies that are ethically sourced from real devices worldwide, take a look at IPRoyal residential proxies . They are easy to set up and use, and the next section will provide an example of how you can use them with Axios.
Axios Proxy Authentication Example
Proxies provided by professional services usually require authentication. In addition to all the usual values used to define a proxy, you need to provide a username and password to prove that you have paid for their services.
If you purchase proxy services, the provider should supply you with all the necessary information: the host, port, username, and password you need to connect to the proxy. For example, if you use IPRoyal residential proxies , you can find the necessary information in your dashboard.
Now, you can put all the information provided into a proxy variable. Copy the code below and fill out the host, port, username, and password fields with the information that your provider has supplied.
const proxy = {
  protocol: 'http',
  host: 'geo.iproyal.com',
  port: 12321,
  auth: {
    username: 'cool username',
    password: 'cool password'
  }
};
After that, you can use the proxy as an argument in your axios.get() calls.
axios.get('https://quotes.toscrape.com/', { proxy: proxy })
  .then((r) => {
    ...
  })
Here's an example of how a small web scraping script using an authenticated proxy and Axios might look:
const axios = require('axios');
const cheerio = require('cheerio');

const proxy = {
  protocol: 'http',
  host: 'geo.iproyal.com',
  port: 12321,
  auth: {
    username: 'cool username',
    password: 'cool password'
  }
};

axios.get('https://quotes.toscrape.com/', { proxy: proxy })
  .then((r) => {
    const $ = cheerio.load(r.data);
    const quote_blocks = $('.quote');
    const quotes = quote_blocks.map((_, quote_block) => {
      const text = $(quote_block).find('.text').text();
      const author = $(quote_block).find('.author').text();
      return { text: text, author: author };
    }).toArray();
    console.log(quotes);
  });
Every request made through the rotating authenticated proxy will go through a different IP address, which helps avoid detection when you need to scrape hundreds or thousands of pages.
Using httpAgent in Axios
If you’re not using a SOCKS proxy, but still want more control over your connection configuration, you can use Node’s built-in http.Agent. It provides more control over how connections are handled and reused, particularly for advanced use cases involving HTTP requests.
If you’re working with HTTPS requests, you’ll need to use https.Agent from Node’s https module. But in this case, we’ll stick to HTTP requests.
Here’s how to use the http.Agent:
const axios = require('axios');
const http = require('http');

const agent = new http.Agent({
  keepAlive: true,
  maxSockets: 10
});

axios.get('http://quotes.toscrape.com/', {
  httpAgent: agent
}).then((res) => {
  console.log(res.data);
});
This doesn’t configure a proxy on its own, but it works well alongside proxy server setups, especially when you’re making many HTTP requests or need to optimize performance. It’s one of those tools you don’t always need, but it helps when you do.
Setting a Proxy via Environment Variables
Storing sensitive information—such as the username and password you use for a proxy—in code is not very secure. If you accidentally share the file with another person or put it on a public GitHub repository, the credentials will be exposed.
To fix that, this information is usually stored in environment variables — user-defined variables that are accessible to programs running on a computer.
Using the terminal, you can define HTTP_PROXY and HTTPS_PROXY environment variables, which include the link to your proxy, including the host, port, and (optionally) authentication details.
If you're using IPRoyal residential proxies, this link is accessible in your dashboard.
Copy the link and set it as an environment variable for both HTTP and HTTPS using the following commands if you're using Windows:
set HTTP_PROXY=http://username:password@host:port
set HTTPS_PROXY=http://username:password@host:port
If you're using Linux or macOS, you need to use the export command instead of set:
export HTTP_PROXY=http://username:password@host:port
export HTTPS_PROXY=http://username:password@host:port
If you run your Axios web scraping script from this terminal session, it will use the defined proxy by default. Axios automatically checks for the HTTP_PROXY and HTTPS_PROXY environment variables and uses them as proxies if found. Keep in mind that variables set this way only persist for the current terminal session.
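One caveat when building these proxy URLs: if the username or password contains special characters such as spaces, @, or :, they must be URL-encoded, or the URL will be misparsed. A small sketch of a helper for this (buildProxyUrl is a hypothetical name, not part of Axios):

```javascript
// Hypothetical helper: builds the proxy URL format used in
// HTTP_PROXY/HTTPS_PROXY, URL-encoding the credentials so that
// characters like ' ', '@', or ':' don't break parsing.
function buildProxyUrl({ protocol, host, port, username, password }) {
  const auth = username
    ? `${encodeURIComponent(username)}:${encodeURIComponent(password)}@`
    : '';
  return `${protocol}://${auth}${host}:${port}`;
}

console.log(buildProxyUrl({
  protocol: 'http',
  host: 'geo.iproyal.com',
  port: 12321,
  username: 'cool username',
  password: 'cool password'
}));
// http://cool%20username:cool%20password@geo.iproyal.com:12321
```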
Rotating Proxies
If you have plenty of proxy servers at your disposal but don't use a professional proxy service that provides rotation by default, you can build a working solution that picks a random proxy from a list of options for each request.
First, instead of creating one proxy variable, create an array with multiple proxies.
const proxies = [
  {
    protocol: 'http',
    host: '128.172.183.18',
    port: 8000
  },
  {
    protocol: 'http',
    host: '18.4.13.6',
    port: 8080
  },
  {
    protocol: 'http',
    host: '65.108.34.224',
    port: 8888
  }
];
Then, create a function that randomly selects one of the proxies.
function get_random_proxy(proxies) {
  return proxies[Math.floor(Math.random() * proxies.length)];
}
Now you can call it on the proxies array to pick a random proxy to go through every time you want to make a request.
axios.get('https://quotes.toscrape.com/', { proxy: get_random_proxy(proxies) })
  .then((r) => { …
Be careful, though: this solution doesn't account for the fact that proxy servers might break or shut down. If the proxies you use are unreliable (they can fail to return the requested information), you will also need a retry mechanism that detects proxy failure and tries a different proxy.
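To sketch what such a retry mechanism could look like: the helper below picks a random proxy for each attempt and falls back to another one when the request fails. requestFn stands for any request function such as axios.get (passing it in keeps the helper easy to test), and the retry count and timeout are assumptions you should tune:

```javascript
// Sketch of a retry helper. `requestFn` is any (url, config) => Promise
// function, e.g. axios.get; `proxies` is an array of proxy objects like
// the ones defined above.
async function getWithRotation(requestFn, url, proxies, retries = 3) {
  let lastError;
  for (let attempt = 0; attempt < retries; attempt++) {
    // Pick a random proxy for this attempt.
    const proxy = proxies[Math.floor(Math.random() * proxies.length)];
    try {
      return await requestFn(url, { proxy, timeout: 5000 });
    } catch (err) {
      lastError = err; // this proxy failed, so try a different one
    }
  }
  throw lastError; // every attempt failed
}

// Usage with Axios:
// getWithRotation(axios.get, 'https://quotes.toscrape.com/', proxies)
//   .then((r) => console.log(r.data))
//   .catch((err) => console.error('All proxies failed:', err.message));
```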
Why Should You Use Proxies?
When connecting to a server, you share the IP address from which your request comes. If this is a one-off request, it doesn't matter.
However, if you plan to request multiple pages to scrape information from the website, your IP address may be flagged by a detection system or administrator and blacklisted. This means that you won't be able to access the website until you change your IP address.
Proxies act as middlemen and forward your requests to the server. The server sees the IP address of the proxy as the origin of the request, and your real IP address is known only to the proxy.
They enable you to scrape large amounts of data and make many more requests than you could with a single IP address. The best of them provide a pool of IP addresses through a single, authorized endpoint, enabling you to rotate IPs on request.
Conclusion
Using proxy services is a great way to enable large-scale web scraping, since it lets you hide web scraping activity from website administrators. If you're using Axios, setting up both unauthenticated and authenticated proxies for your web scraping projects is quite easy.
But sometimes, a proxy is not enough to look natural. In these cases, you can use a browser automation tool like Puppeteer to mimic a real user visiting the site. To learn more about this tool, refer to our extensive guide on Puppeteer .
FAQ
AxiosError: connect ETIMEDOUT
This error means that Axios timed out while trying to connect to a remote server. Depending on the server it failed to reach (which is listed in the error), it means that either the page you're trying to connect to or the proxy is down.
To ensure that proxies don't shut down during the scraping process, it is essential to either use a reliable proxy provider or write code for rotating proxies and retry with a different one.
AxiosError: Request failed with status code 404
This error means that Axios was able to connect to a server of your choosing, but there was no resource to serve because the URL you provided to Axios was incorrect (didn't correspond to any web page on the server).
AxiosError: Request failed with status code 407
This error means the proxy server requires authentication (status 407 stands for "Proxy Authentication Required") and your request either didn't include credentials or included incorrect ones. Make sure your proxy configuration contains an auth object with the username and password your provider supplied:
const proxy = {
  protocol: 'http',
  host: 'geo.iproyal.com',
  port: 12321,
  auth: {
    username: 'cool username',
    password: 'cool password'
  }
};
What is a proxy in the context of Axios?
A proxy server acts as a middleman. It takes your request, forwards it to the target site, and then sends the response back. When you use a proxy configuration with Axios, the target server sees the proxy’s IP instead of yours, which helps mask your identity if the setup is correct.
Working through a proxy this way is useful when scraping or when you need to mask your location or identity.
Does Axios support HTTP and HTTPS proxies?
Yes, Axios supports both HTTP and HTTPS proxies in Node.js using the built-in proxy option. However, it doesn’t apply in the browser environment. You can provide the host, port, and even auth in the proxy configuration. If you’re working with a SOCKS proxy, you’ll need an external library, such as socks-proxy-agent, to assist you.
Can I disable the proxy in Axios?
Yes. If your environment has a default proxy configuration, but you want Axios to ignore it, you can set the proxy: false option like this:
axios.get('https://iproyal.com', { proxy: false });
That’ll make the request go straight out without using any proxy server.
Does Axios proxy work in the browser?
No. Axios doesn’t support the proxy option in the browser. Browser security rules don’t allow direct low-level proxy configuration. If you need to use a SOCKS proxy or HTTPS proxy, you’ll need to set it up in the browser itself or at the system/network level.