<img src="https://github.com/ZacharyHampton/HomeHarvest/assets/78247585/d1a2bf8b-09f5-4c57-b33a-0ada8a34f12d" width="400">
**HomeHarvest** is a simple yet comprehensive real estate scraping library that extracts and formats data in the style of MLS listings.
[![Try with Replit](https://replit.com/badge?caption=Try%20with%20Replit)](https://replit.com/@ZacharyHampton/HomeHarvestDemo)
**Not technical?** Try out the web scraping tool on our site at [tryhomeharvest.com](https://tryhomeharvest.com).
*Looking to build a data-focused software product?* **[Book a call](https://calendly.com/zachary-products/15min)** *to work with us.*
Check out another project we wrote: ***[JobSpy](https://github.com/cullenwatson/JobSpy)** – a Python package for job scraping*
## HomeHarvest Features
- **Source**: Fetches properties directly from **Realtor.com**.
- **Data Format**: Structures data to resemble MLS listings.
- **Export Flexibility**: Options to save as either CSV or Excel.
- **Usage Modes**:
  - **CLI**: For users who prefer command-line operations.
  - **Python**: For those who'd like to integrate scraping into their Python scripts.
[Video Guide for HomeHarvest](https://youtu.be/JnV7eR2Ve2o) - _updated for release v0.2.7_
![homeharvest](https://github.com/ZacharyHampton/HomeHarvest/assets/78247585/b3d5d727-e67b-4a9f-85d8-1e65fd18620a)
## Installation
```bash
pip install homeharvest
```
_Python version >= [3.10](https://www.python.org/downloads/release/python-3100/) required_
## Usage
### CLI
```
usage: homeharvest [-h] [-l {for_sale,for_rent,sold}] [-o {excel,csv}] [-f FILENAME] [-p PROXY] [-d DAYS] [-r RADIUS] location

Home Harvest Property Scraper

positional arguments:
  location              Location to scrape (e.g., San Francisco, CA)

options:
  -l {for_sale,for_rent,sold}, --listing_type {for_sale,for_rent,sold}
                        Listing type to scrape
  -o {excel,csv}, --output {excel,csv}
                        Output format
  -f FILENAME, --filename FILENAME
                        Name of the output file (without extension)
  -p PROXY, --proxy PROXY
                        Proxy to use for scraping
  -d DAYS, --days DAYS  Sold in last _ days filter.
  -r RADIUS, --radius RADIUS
                        Get comparable properties within _ (eg. 0.0) miles. Only applicable for individual addresses.
```
```bash
> homeharvest "San Francisco, CA" -l for_rent -o excel -f HomeHarvest
```
### Python
```py
from homeharvest import scrape_property
from datetime import datetime

# Generate a filename based on the current timestamp
current_timestamp = datetime.now().strftime("%Y%m%d_%H%M%S")
filename = f"output/{current_timestamp}.csv"  # the output/ directory must already exist

properties = scrape_property(
    location="San Diego, CA",
    listing_type="sold",  # or: for_sale, for_rent
)

print(f"Number of properties: {len(properties)}")
properties.to_csv(filename, index=False)
```
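
One practical note on the example above: `DataFrame.to_csv` will not create a missing directory. A small stdlib-only sketch for preparing the timestamped output path:

```python
from datetime import datetime
from pathlib import Path

# Create the output directory if it is missing; to_csv will not do this.
out_dir = Path("output")
out_dir.mkdir(exist_ok=True)

# Timestamped filename, matching the pattern used in the example above.
filename = out_dir / f"{datetime.now().strftime('%Y%m%d_%H%M%S')}.csv"
print(filename)
```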
## Output
```plaintext
>>> properties.head()
    MLS        MLS # Status          Style  ...     COEDate LotSFApx PrcSqft Stories
0  SDCA    230018348   SOLD         CONDOS  ...  2023-10-03   290110     803       2
1  SDCA    230016614   SOLD      TOWNHOMES  ...  2023-10-03     None     838       3
2  SDCA    230016367   SOLD         CONDOS  ...  2023-10-03    30056     649       1
3  MRCA   NDP2306335   SOLD  SINGLE_FAMILY  ...  2023-10-03     7519     661       2
4  SDCA    230014532   SOLD         CONDOS  ...  2023-10-03     None     752       1

[5 rows x 22 columns]
```
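
Because the result is a pandas DataFrame, it can be filtered and summarized directly. A sketch using column names from the sample above (the frame here is constructed by hand so the snippet runs without scraping):

```python
import pandas as pd

# Stand-in for scrape_property() output, with columns from the sample above.
properties = pd.DataFrame({
    "MLS": ["SDCA", "SDCA", "SDCA", "MRCA", "SDCA"],
    "Status": ["SOLD"] * 5,
    "PrcSqft": [803, 838, 649, 661, 752],
    "Stories": [2, 3, 1, 2, 1],
})

# Median sold price per square foot across the listings.
median_prc = properties["PrcSqft"].median()
print(median_prc)  # 752.0

# Single-story listings only.
single_story = properties[properties["Stories"] == 1]
```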
### Parameters for `scrape_property()`
```
Required
├── location (str): address in various formats e.g. just zip, full address, city/state, etc.
└── listing_type (enum): for_rent, for_sale, sold

Optional
├── radius_for_comps (float): radius in miles to find comparable properties based on individual addresses
├── sold_last_x_days (int): number of past days to filter sold properties
└── proxy (str): in format 'http://user:pass@host:port'
```
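
The `proxy` parameter expects a single URL string. A sketch of assembling and sanity-checking it with the stdlib (credentials and host here are placeholders):

```python
from urllib.parse import urlsplit

# Placeholder credentials/host; substitute your proxy's real values.
user, password, host, port = "user", "pass", "127.0.0.1", 8080
proxy = f"http://{user}:{password}@{host}:{port}"

# Quick sanity check that the string parses as expected.
parts = urlsplit(proxy)
print(parts.hostname, parts.port)  # 127.0.0.1 8080
```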
### Property Schema
```plaintext
Property
├── Basic Information:
│   ├── property_url (str)
│   ├── mls (str)
│   ├── mls_id (str)
│   └── status (str)
│
├── Address Details:
│   ├── street (str)
│   ├── unit (str)
│   ├── city (str)
│   ├── state (str)
│   └── zip (str)
│
├── Property Description:
│   ├── style (str)
│   ├── beds (int)
│   ├── baths_full (int)
│   ├── baths_half (int)
│   ├── sqft (int)
│   ├── lot_sqft (int)
│   ├── sold_price (int)
│   ├── year_built (int)
│   ├── garage (float)
│   └── stories (int)
│
├── Property Listing Details:
│   ├── list_price (int)
│   ├── list_date (str)
│   ├── last_sold_date (str)
│   ├── prc_sqft (int)
│   └── hoa_fee (int)
│
└── Location Details:
    ├── latitude (float)
    ├── longitude (float)
    └── neighborhoods (str)
```
## Supported Countries for Property Scraping
* **Realtor.com**: listings are mainly from the **US**, but some international listings are also available
### Exceptions
The following exceptions may be raised when using HomeHarvest:
- `InvalidListingType` - valid options: `for_sale`, `for_rent`, `sold`
- `NoResultsFound` - no properties found from your input
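
A sketch of handling these defensively. The exception classes are defined as stand-ins here so the snippet runs standalone; in real code, import them from the package (the exact import path is not shown in this README):

```python
# Stand-in exception classes mirroring the names listed above.
class InvalidListingType(Exception):
    pass

class NoResultsFound(Exception):
    pass

VALID_LISTING_TYPES = {"for_sale", "for_rent", "sold"}

def safe_scrape(scrape, location, listing_type):
    """Validate inputs up front, then treat 'no results' as an empty outcome."""
    if listing_type not in VALID_LISTING_TYPES:
        raise InvalidListingType(f"invalid listing type: {listing_type}")
    try:
        return scrape(location, listing_type)
    except NoResultsFound:
        return None
```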
## Frequently Asked Questions
---
**Q: Encountering issues with your searches?**
**A:** Try to broaden the location. If problems persist, [submit an issue](https://github.com/ZacharyHampton/HomeHarvest/issues).
---
**Q: Received a Forbidden 403 response code?**
**A:** This indicates that you have been blocked by Realtor.com for sending too many requests. We recommend:
- Waiting a few seconds between requests.
- Trying a VPN to change your IP address.
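
The "wait between requests" advice can be automated with a small retry helper, sketched here with the stdlib only (this helper is not part of HomeHarvest):

```python
import random
import time

def fetch_with_backoff(fetch, max_retries=3, base_delay=1.0):
    """Call fetch(), retrying on failure with jittered exponential backoff."""
    for attempt in range(max_retries):
        try:
            return fetch()
        except Exception:
            if attempt == max_retries - 1:
                raise  # out of retries; surface the last error
            # Wait base_delay * (1, 2, 4, ...) seconds plus jitter.
            time.sleep(base_delay * (2 ** attempt + random.random()))
```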
---