
How to Build Your Own Best Buy Price Tracker Using Python

Eugenijus Denisov


Best Buy is one of the most popular stores for electronics and gadgets, favored by professionals and enthusiasts across many hobbies and jobs. Many of the items sold there are quite expensive, so buyers are constantly on the lookout for deals, price drops, and discounts.

Manually checking the online store takes a lot of time, so various Best Buy price tracker tools have popped up. Some users have even built their own Best Buy price tracker, as the process isn’t all that difficult. We’ll show you exactly how to do that.

What Is a Best Buy Price Tracker?

A Best Buy price tracker is a tool that automatically scans the online store for prices of various goods and compares the acquired data to historical price data. If a discount, price reduction, or sale is detected, the tool can alert the user.

Most price trackers are built on web scraping, a process in which an automated program downloads data from a website. That information is then cleaned, and only the important bits are retained.

Price tracking tools will also include data such as URLs, product descriptions, and titles in conjunction with the pricing information. Additional metrics, such as the size of the discount or general price history, may be shown as well.

As mentioned previously, alerts are another common feature of a price tracker. Many users would like to set price threshold alerts – when a certain item falls below a set price point, an email or other notification is sent to inform the user.

All of these features serve a single overarching goal: to inform users about price history and changes, and to create opportunities to save money through price drop alerts.

What Are the Benefits of Tracking Prices at Best Buy?

The main reason anyone would want to track prices at Best Buy is to capture the best deals. Deals are especially important for anyone who purchases tools or equipment professionally, as they help maintain margins and reduce costs.

Additionally, price tracking lets users see whether it’s worth waiting for a discount. If an item hasn’t been discounted in a long time, it’s unlikely (although not impossible) that there will be a price drop in the future.

Finally, by running several price trackers at once for the same products across different websites, users can make better purchasing decisions, leading to major savings in both the short and long term.

Can You Set a Price Alert on Best Buy?

Price drop alerts are only available sometimes, usually during promotions or special deals. For regular usage, there’s no way to get alerted about price changes for individual products. That’s part of the reason why many people use third-party monitoring tools.

How to Build a Price Tracking Tool with Python

Python is a relatively accessible programming language that has a lot of libraries that make web scraping easier. We’ll be using it to create our price tracking tool and collect historical data.

Create a new project in your preferred IDE and start by installing all of the libraries we’ll be using:

pip install requests beautifulsoup4 pandas schedule

All of these libraries will be used in turn. Let’s start with creating a process that’ll visit a Best Buy URL and download the HTML file:

import requests
from bs4 import BeautifulSoup

def get_price(url):
    headers = {
        "User-Agent": "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/85.0.4183.121 Safari/537.36"
    }
    response = requests.get(url, headers=headers)
    # Fail early on blocked or broken requests (e.g., 403 or 404)
    response.raise_for_status()
    soup = BeautifulSoup(response.content, 'html.parser')

    # The price sits inside this div on current product pages (inspect the page to confirm)
    price_tag = soup.find("div", {"class": "priceView-hero-price priceView-customer-price"})
    if price_tag is None:
        raise ValueError("Price element not found - the page layout may have changed or the request was blocked")
    price = price_tag.find("span").get_text()
    return float(price.replace("$", "").replace(",", ""))

print(get_price('https://www.bestbuy.com/site/apple-10-2-inch-ipad-9th-generation-with-wi-fi-64gb-space-gray/4901809.p?skuId=4901809'))

We’ve picked a random item from Best Buy, but you can implement price tracking for any product.

Our code starts by defining a user agent that represents a regular browser. This is necessary as the requests library sends a default user agent that’s often detected and blocked by websites.

We then store the HTML content of the URL in a response object and parse it with BeautifulSoup, searching the HTML for the price tag and extracting its text. Finally, we return the price as a float.

To debug the process, run a print function to see if everything’s working as intended.

However, if you’re outside of the US, you’ll need Best Buy proxies to change your location to one within the United States. Otherwise, Best Buy may block your requests by default.
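Routing requests through a proxy only takes one extra argument, since the requests library accepts a proxies dictionary. Here’s a minimal sketch, assuming a hypothetical proxy endpoint and credentials (replace them with your provider’s actual details):

```python
import requests

# Hypothetical proxy endpoint and credentials - swap in your provider's details
proxies = {
    "http": "http://username:[email protected]:12345",
    "https": "http://username:[email protected]:12345",
}

def get_html(url):
    headers = {
        "User-Agent": "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/85.0.4183.121 Safari/537.36"
    }
    # The proxies argument routes the request through the proxy,
    # so the site sees the proxy's (US) IP address instead of yours
    response = requests.get(url, headers=headers, proxies=proxies, timeout=30)
    response.raise_for_status()
    return response.text
```

Passing `proxies` per request (rather than system-wide) keeps the rest of the tracker unchanged.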

Storing the Data Internally

We now need to store the data as it’s currently only saved in memory (and the standard output, if you used print). Pandas is perfect for data storage, so we’ll be using that to move our data to a CSV file:

import pandas as pd
from datetime import datetime

def save_to_csv(price, url):
    data = {
        'Date': [datetime.now()],
        'Price': [price],
        'URL': [url]
    }
    df = pd.DataFrame(data)
    df.to_csv('best_buy_prices.csv', mode='a', header=False, index=False)

While we’re showing the code in pieces, make sure all imports sit at the top of your file, followed by the function definitions. Don’t simply paste the snippets in the exact order they appear in this blog post.

Our function takes two arguments: the price (which we can retrieve by calling get_price()) and the URL, which we set ourselves. It then logs the current time, the price, and the URL, turns them into a DataFrame, and appends it to a CSV file.
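One quirk of this approach: since we always pass header=False, the CSV file never gets a header row. A minimal sketch of a variant that writes the header only on the first run (the filename parameter is our own addition for illustration):

```python
import os
import pandas as pd
from datetime import datetime

def save_to_csv(price, url, filename='best_buy_prices.csv'):
    df = pd.DataFrame({
        'Date': [datetime.now()],
        'Price': [price],
        'URL': [url]
    })
    # Write the header row only when the file doesn't exist yet,
    # so repeated appends don't scatter headers through the data
    df.to_csv(filename, mode='a', header=not os.path.exists(filename), index=False)
```

With the header in place, loading the history back later with pd.read_csv() gives you properly named columns.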

Your current code block should look something like this:

import requests
from bs4 import BeautifulSoup
import pandas as pd
from datetime import datetime

def get_price(url):
    headers = {
        "User-Agent": "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/85.0.4183.121 Safari/537.36"
    }
    response = requests.get(url, headers=headers)
    # Fail early on blocked or broken requests (e.g., 403 or 404)
    response.raise_for_status()
    soup = BeautifulSoup(response.content, 'html.parser')

    # The price sits inside this div on current product pages (inspect the page to confirm)
    price_tag = soup.find("div", {"class": "priceView-hero-price priceView-customer-price"})
    if price_tag is None:
        raise ValueError("Price element not found - the page layout may have changed or the request was blocked")
    price = price_tag.find("span").get_text()
    return float(price.replace("$", "").replace(",", ""))

def save_to_csv(price, url):
    data = {
        'Date': [datetime.now()],
        'Price': [price],
        'URL': [url]
    }
    df = pd.DataFrame(data)
    df.to_csv('best_buy_prices.csv', mode='a', header=False, index=False)


my_item_price = get_price('https://www.bestbuy.com/site/apple-10-2-inch-ipad-9th-generation-with-wi-fi-64gb-space-gray/4901809.p?skuId=4901809')
save_to_csv(my_item_price, 'https://www.bestbuy.com/site/apple-10-2-inch-ipad-9th-generation-with-wi-fi-64gb-space-gray/4901809.p?skuId=4901809')

Setting up Email Alerts

For our next step, we’ll need to set up a burner Gmail account that’ll be responsible for sending alerts. Once the account is registered, enable 2FA and then head over to the app password page.

You’ll need to set up an app password to log in to your account and use it in the following code:

import smtplib

def send_email(price, url, threshold):
    if price <= threshold:
        server = smtplib.SMTP('smtp.gmail.com', 587)
        server.starttls()
        server.login("[email protected]", "your_app_password")
        message = f"Subject: Price Alert!\n\nThe price of the item has dropped to ${price}.\nCheck it out here: {url}"
        server.sendmail("[email protected]", "[email protected]", message)
        server.quit()

While the code is largely self-explanatory, there’s an important caveat: if you intend to actually use such a price tracking tool, load your credentials with dotenv instead of storing them directly in your code. We only hardcode them here for tutorial purposes, as leaving sensitive information directly in code is a serious security risk.
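As a sketch of what that looks like, here’s a variant of send_email() that reads credentials from environment variables instead of hardcoding them. The variable names (ALERT_EMAIL, ALERT_PASSWORD, ALERT_RECIPIENT) are our own choices, and python-dotenv can populate them from a local .env file:

```python
import os
import smtplib

# With python-dotenv installed (pip install python-dotenv), you would call
# load_dotenv() here to read a .env file into the environment.
# The .env file (never commit it to version control) would contain lines like:
#   [email protected]
#   ALERT_PASSWORD=your_app_password
#   [email protected]

def send_email(price, url, threshold):
    # Only send an alert when the price is at or below the threshold
    if price > threshold:
        return
    sender = os.environ["ALERT_EMAIL"]
    password = os.environ["ALERT_PASSWORD"]
    recipient = os.environ["ALERT_RECIPIENT"]
    server = smtplib.SMTP('smtp.gmail.com', 587)
    server.starttls()
    server.login(sender, password)
    message = f"Subject: Price Alert!\n\nThe price of the item has dropped to ${price}.\nCheck it out here: {url}"
    server.sendmail(sender, recipient, message)
    server.quit()
```

This way, rotating the app password only requires editing the .env file, not the code.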

We’ll need to call the function and have it running permanently to be able to send you an email. That’ll be our next step.

Scheduling Jobs

Price tracking tools work best when deployed on a constant basis instead of running them manually each time. The schedule library is perfect for our task:

import schedule
import time

url = "https://www.bestbuy.com/site/apple-10-2-inch-ipad-9th-generation-with-wi-fi-64gb-space-gray/4901809.p?skuId=4901809"
threshold = 299.99  # Set your desired price threshold

def job():
    price = get_price(url)
    save_to_csv(price, url)
    send_email(price, url, threshold)

# Schedule to run daily
schedule.every().day.at("09:00").do(job)

while True:
    schedule.run_pending()
    time.sleep(60)

Since we now have a job() function that wraps all of the previous ones, we no longer need to call each function separately. The while True loop runs continuously, checking every 60 seconds whether any scheduled jobs are due to run.

Just above the loop, we’ve scheduled the job to run every day at 9 AM.

Your full block of code should now look something like this:

import requests
from bs4 import BeautifulSoup
import pandas as pd
from datetime import datetime
import smtplib
import schedule
import time

# Set your Best Buy product URL and desired price threshold
url = "https://www.bestbuy.com/site/apple-10-2-inch-ipad-9th-generation-with-wi-fi-64gb-space-gray/4901809.p?skuId=4901809"
threshold = 299.99  # Set your desired price threshold

def get_price(url):
    headers = {
        "User-Agent": "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/85.0.4183.121 Safari/537.36"
    }
    response = requests.get(url, headers=headers)
    # Fail early on blocked or broken requests (e.g., 403 or 404)
    response.raise_for_status()
    soup = BeautifulSoup(response.content, 'html.parser')

    # Locate the price within the HTML (use the browser inspect tool to find the correct tag)
    price_tag = soup.find("div", {"class": "priceView-hero-price priceView-customer-price"})
    if price_tag is None:
        raise ValueError("Price element not found - the page layout may have changed or the request was blocked")
    price = float(price_tag.find("span").get_text().replace("$", "").replace(",", ""))
    print(price)
    return price

def save_to_csv(price, url):
    data = {
        'Date': [datetime.now()],
        'Price': [price],
        'URL': [url]
    }
    df = pd.DataFrame(data)
    df.to_csv('best_buy_prices.csv', mode='a', header=False, index=False)

def send_email(price, url, threshold):
    if price <= threshold:
        server = smtplib.SMTP('smtp.gmail.com', 587)
        server.starttls()
        server.login("[email protected]", "your_app_password")
        message = f"Subject: Price Alert!\n\nThe price of the item has dropped to ${price}.\nCheck it out here: {url}"
        server.sendmail("[email protected]", "[email protected]", message)
        server.quit()
        print(f"Email sent! The price dropped to ${price}")

def job():
    price = get_price(url)
    save_to_csv(price, url)
    send_email(price, url, threshold)

# Schedule the tracker to run daily at a specified time
schedule.every().day.at("09:00").do(job)
while True:
    schedule.run_pending()
    time.sleep(60)

Note that if you want to test the entire block of code, you’ll need to set up the email account and either wait until 9 AM or simply call the job() function yourself.

Final Thoughts

That’s all that you need to do to get started. To track more than one product, you can create a list of URLs and iterate over it whenever a job runs to keep extracting information. Depending on your use case, you may also need to set different thresholds for each.


Author

Eugenijus Denisov

Senior Software Engineer

With over a decade of experience under his belt, Eugenijus has worked on a wide range of projects - from LMS (learning management system) to large-scale custom solutions for businesses and the medical sector. Proficient in PHP, Vue.js, Docker, MySQL, and TypeScript, Eugenijus is dedicated to writing high-quality code while fostering a collaborative team environment and optimizing work processes. Outside of work, you’ll find him running marathons and cycling challenging routes to recharge mentally and build self-confidence.
