HomeHarvest

HomeHarvest is a simple, yet comprehensive, real estate scraping library that extracts and formats data in the style of MLS listings.

Not technical? Try out the web scraping tool on our site at tryhomeharvest.com.

Looking to build a data-focused software product? Book a call to work with us.

HomeHarvest Features

  • Source: Fetches properties directly from Realtor.com.
  • Data Format: Structures data to resemble MLS listings.
  • Export Flexibility: Options to save as either CSV or Excel.
  • Usage Modes:
    • Python: For those who'd like to integrate scraping into their Python scripts.
    • CLI: For users who prefer command-line operations.

Video Guide for HomeHarvest - updated for release v0.3.4

Installation

pip install homeharvest

Python version >= 3.10 required

Usage

Python

from homeharvest import scrape_property
from datetime import datetime

# Generate filename based on current timestamp
current_timestamp = datetime.now().strftime("%Y%m%d_%H%M%S")
filename = f"HomeHarvest_{current_timestamp}.csv"

properties = scrape_property(
  location="San Diego, CA",
  listing_type="sold",  # or (for_sale, for_rent, pending)
  past_days=30,  # sold in last 30 days - listed in last 30 days if (for_sale, for_rent)
  
  # date_from="2023-05-01", # alternative to past_days 
  # date_to="2023-05-28", 
  
  # mls_only=True,  # only fetch MLS listings
  # proxy="http://user:pass@host:port"  # use a proxy to change your IP address
)
print(f"Number of properties: {len(properties)}")

# Export to csv
properties.to_csv(filename, index=False)
print(properties.head())

Output

>>> properties.head()
    MLS       MLS # Status          Style  ...     COEDate LotSFApx PrcSqft Stories
0  SDCA   230018348   SOLD         CONDOS  ...  2023-10-03   290110     803       2
1  SDCA   230016614   SOLD      TOWNHOMES  ...  2023-10-03     None     838       3
2  SDCA   230016367   SOLD         CONDOS  ...  2023-10-03    30056     649       1
3  MRCA  NDP2306335   SOLD  SINGLE_FAMILY  ...  2023-10-03     7519     661       2
4  SDCA   230014532   SOLD         CONDOS  ...  2023-10-03     None     752       1
[5 rows x 22 columns]
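
Since scrape_property() returns a pandas DataFrame (note the .head() and .to_csv() calls above), the same result can be written to Excel instead of CSV. A minimal sketch, assuming the openpyxl engine is installed alongside pandas:

from homeharvest import scrape_property

properties = scrape_property(
    location="San Diego, CA",
    listing_type="sold",
    past_days=30,
)

# DataFrame.to_excel needs an Excel writer engine such as openpyxl
properties.to_excel("HomeHarvest.xlsx", index=False)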

Parameters for scrape_property()

Required
├── location (str): The address in various formats - this could be just a zip code, a full address, or city/state, etc.
└── listing_type (str): The type of listing to scrape.
    - 'for_rent'
    - 'for_sale'
    - 'sold'
    - 'pending'

Optional
├── radius (decimal): Radius in miles to find comparable properties based on individual addresses.
│    Example: 5.5 (fetches properties within a 5.5-mile radius if location is set to a specific address; otherwise, ignored)
│
├── past_days (integer): Number of past days to filter properties. Utilizes 'last_sold_date' for 'sold' listing types, and 'list_date' for others (for_rent, for_sale).
│    Example: 30 (fetches properties listed/sold in the last 30 days)
│
├── date_from, date_to (string): Start and end dates to filter properties listed or sold; both dates are required.
│    (Use these to fetch properties in chunks, as there is a 10,000-result limit per search.)
│    Format for both must be "YYYY-MM-DD".
│    Example: "2023-05-01", "2023-05-15" (fetches properties listed/sold between these dates)
│
├── mls_only (True/False): If set, fetches only MLS listings (mainly applicable to 'sold' listings)
│
└── proxy (string): In format 'http://user:pass@host:port'
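
For example, several of the optional filters can be combined in one call. A minimal sketch using only the parameters documented above; the address and dates are placeholders:

from homeharvest import scrape_property

# Comparable sold properties within 5.5 miles of a specific address,
# limited to a date range and to MLS listings only.
properties = scrape_property(
    location="123 Main St, San Diego, CA",  # placeholder address
    listing_type="sold",
    radius=5.5,
    date_from="2023-05-01",
    date_to="2023-05-28",
    mls_only=True,
    # proxy="http://user:pass@host:port",  # optional proxy
)
print(f"Number of properties: {len(properties)}")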


CLI

usage: homeharvest [-l {for_sale,for_rent,sold,pending}] [-o {excel,csv}] [-f FILENAME] [-p PROXY] [-d DAYS] [-r RADIUS] [-m] [-c] location

Home Harvest Property Scraper

positional arguments:
  location              Location to scrape (e.g., San Francisco, CA)

options:
  -l {for_sale,for_rent,sold,pending}, --listing_type {for_sale,for_rent,sold,pending}
                        Listing type to scrape
  -o {excel,csv}, --output {excel,csv}
                        Output format
  -f FILENAME, --filename FILENAME
                        Name of the output file (without extension)
  -p PROXY, --proxy PROXY
                        Proxy to use for scraping
  -d DAYS, --days DAYS  Sold/listed in last _ days filter.
  -r RADIUS, --radius RADIUS
                        Get comparable properties within _ (e.g., 0.0) miles. Only applicable for individual addresses.
  -m, --mls_only        If set, fetches only MLS listings.

homeharvest "San Francisco, CA" -l for_rent -o excel -f HomeHarvest

Property Schema

Property
├── Basic Information:
│ ├── property_url
│ ├── mls
│ ├── mls_id
│ └── status

├── Address Details:
│ ├── street
│ ├── unit
│ ├── city
│ ├── state
│ └── zip_code

├── Property Description:
│ ├── style
│ ├── beds
│ ├── full_baths
│ ├── half_baths
│ ├── sqft
│ ├── year_built
│ ├── stories
│ └── lot_sqft

├── Property Listing Details:
│ ├── days_on_mls
│ ├── list_price
│ ├── list_date
│ ├── sold_price
│ ├── last_sold_date
│ ├── price_per_sqft
│ └── hoa_fee

├── Location Details:
│ ├── latitude
│ └── longitude

└── Parking Details:
    └── parking_garage
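
Each field above corresponds to a column in the returned DataFrame, so individual fields can be selected directly. A minimal sketch; the column names are taken from the schema above:

from homeharvest import scrape_property

properties = scrape_property(location="San Diego, CA", listing_type="sold", past_days=30)

# Pick a few schema columns and sort by sale price
subset = properties[["property_url", "style", "beds", "sqft", "sold_price", "last_sold_date"]]
print(subset.sort_values("sold_price", ascending=False).head())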

Exceptions

The following exceptions may be raised when using HomeHarvest:

  • InvalidListingType - listing_type is not one of the valid options: for_sale, for_rent, sold, pending
  • InvalidDate - date_from or date_to is not in the format YYYY-MM-DD
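
A sketch of handling these exceptions; the import path is an assumption and may differ between versions:

from homeharvest import scrape_property
# Assumed import path for the exceptions listed above
from homeharvest.exceptions import InvalidListingType, InvalidDate

try:
    properties = scrape_property(
        location="San Diego, CA",
        listing_type="sold",
        date_from="2023-05-01",
        date_to="2023-05-28",
    )
except InvalidListingType as e:
    print(f"Invalid listing_type: {e}")
except InvalidDate as e:
    print(f"Dates must be in YYYY-MM-DD format: {e}")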

Frequently Asked Questions

Q: Encountering issues with your searches?
A: Try broadening your search parameters. If problems persist, submit an issue.

Q: Received a Forbidden 403 response code?
A: This indicates that Realtor.com has blocked you for sending too many requests. We recommend:

  • Waiting a few seconds between requests.
  • Trying a VPN or using a proxy as a parameter to scrape_property() to change your IP address.
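
One way to follow both recommendations from plain Python is to pause between calls and retry through a proxy when a request is blocked. A minimal sketch; the retry logic and proxy URL are assumptions, not part of the library:

import time
from homeharvest import scrape_property

locations = ["San Diego, CA", "Carlsbad, CA", "Oceanside, CA"]
results = []

for location in locations:
    try:
        results.append(scrape_property(location=location, listing_type="sold", past_days=30))
    except Exception as e:  # e.g., blocked with a 403 by Realtor.com
        print(f"{location} failed ({e}); retrying through a proxy")
        results.append(
            scrape_property(
                location=location,
                listing_type="sold",
                past_days=30,
                proxy="http://user:pass@host:port",  # placeholder proxy
            )
        )
    time.sleep(5)  # wait a few seconds between requests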