[enh]: make last_x_days generic

add mls_only
make radius generic
pull/31/head
Cullen Watson 2023-10-04 10:11:53 -05:00
parent 51bde20c3c
commit c4870677c2
9 changed files with 220 additions and 201 deletions


@@ -36,7 +36,7 @@ pip install homeharvest
 ### CLI
 ```
-usage: homeharvest [-h] [-l {for_sale,for_rent,sold}] [-o {excel,csv}] [-f FILENAME] [-p PROXY] [-d DAYS] [-r RADIUS] location
+usage: homeharvest [-l {for_sale,for_rent,sold}] [-o {excel,csv}] [-f FILENAME] [-p PROXY] [-d DAYS] [-r RADIUS] [-m] location
 
 Home Harvest Property Scraper
@@ -55,6 +55,7 @@ options:
   -d DAYS, --days DAYS  Sold in last _ days filter.
   -r RADIUS, --radius RADIUS
                         Get comparable properties within _ (eg. 0.0) miles. Only applicable for individual addresses.
+  -m, --mls_only        If set, fetches only MLS listings.
 ```
 ```bash
 > homeharvest "San Francisco, CA" -l for_rent -o excel -f HomeHarvest
@@ -73,9 +74,14 @@ filename = f"output/{current_timestamp}.csv"
 properties = scrape_property(
     location="San Diego, CA",
     listing_type="sold",  # for_sale, for_rent
+    last_x_days=30,  # sold/listed in last 30 days
+    mls_only=True,  # only fetch MLS listings
 )
 print(f"Number of properties: {len(properties)}")
+
+# Export to csv
 properties.to_csv(filename, index=False)
+print(properties.head())
 ```
@@ -94,12 +100,23 @@ properties.to_csv(filename, index=False)
 ### Parameters for `scrape_property()`
 ```
 Required
-├── location (str): address in various formats e.g. just zip, full address, city/state, etc.
-└── listing_type (enum): for_rent, for_sale, sold
+├── location (str): The address in various formats - this could be just a zip code, a full address, or city/state, etc.
+└── listing_type (option): Choose the type of listing.
+        - 'for_rent'
+        - 'for_sale'
+        - 'sold'
 
 Optional
-├── radius_for_comps (float): Radius in miles to find comparable properties based on individual addresses.
-├── sold_last_x_days (int): Number of past days to filter sold properties.
-├── proxy (str): in format 'http://user:pass@host:port'
+├── radius (decimal): Radius in miles to find comparable properties based on individual addresses.
+│    Example: 5.5 (fetches properties within a 5.5-mile radius if location is set to a specific address; otherwise, ignored)
+├── last_x_days (integer): Number of past days to filter properties. Utilizes 'COEDate' for 'sold' listing types, and 'Lst Date' for others (for_rent, for_sale).
+│    Example: 30 (fetches properties listed/sold in the last 30 days)
+├── mls_only (True/False): If set, fetches only MLS listings (mainly applicable to 'sold' listings)
+└── proxy (string): In format 'http://user:pass@host:port'
 ```
 ### Property Schema
 ```plaintext
@@ -111,51 +128,49 @@ Property
 │   └── status (str)
 ├── Address Details:
-│   ├── street (str)
-│   ├── unit (str)
-│   ├── city (str)
-│   ├── state (str)
-│   └── zip (str)
+│   ├── street
+│   ├── unit
+│   ├── city
+│   ├── state
+│   └── zip
 ├── Property Description:
-│   ├── style (str)
-│   ├── beds (int)
-│   ├── baths_full (int)
-│   ├── baths_half (int)
-│   ├── sqft (int)
-│   ├── lot_sqft (int)
-│   ├── sold_price (int)
-│   ├── year_built (int)
-│   ├── garage (float)
-│   └── stories (int)
+│   ├── style
+│   ├── beds
+│   ├── baths_full
+│   ├── baths_half
+│   ├── sqft
+│   ├── lot_sqft
+│   ├── sold_price
+│   ├── year_built
+│   ├── garage
+│   └── stories
 ├── Property Listing Details:
-│   ├── list_price (int)
-│   ├── list_date (str)
-│   ├── last_sold_date (str)
-│   ├── prc_sqft (int)
-│   └── hoa_fee (int)
+│   ├── list_price
+│   ├── list_date
+│   ├── last_sold_date
+│   ├── prc_sqft
+│   └── hoa_fee
 ├── Location Details:
-│   ├── latitude (float)
-│   ├── longitude (float)
-│   └── neighborhoods (str)
+│   ├── latitude
+│   ├── longitude
+│   └── neighborhoods
 ```
+
+## Supported Countries for Property Scraping
+
+* **Realtor.com**: mainly from the **US** but also has international listings
 
 ### Exceptions
 The following exceptions may be raised when using HomeHarvest:
 - `InvalidListingType` - valid options: `for_sale`, `for_rent`, `sold`
-- `NoResultsFound` - no properties found from your input
+- `NoResultsFound` - no properties found from your search
 
 ## Frequently Asked Questions
 ---
 
 **Q: Encountering issues with your searches?**
-**A:** Try to broaden the location. If problems persist, [submit an issue](https://github.com/ZacharyHampton/HomeHarvest/issues).
+**A:** Try to broaden the parameters you're using. If problems persist, [submit an issue](https://github.com/ZacharyHampton/HomeHarvest/issues).
 
 ---
@@ -163,7 +178,7 @@ The following exceptions may be raised when using HomeHarvest:
 **A:** This indicates that you have been blocked by Realtor.com for sending too many requests. We recommend:
 
 - Waiting a few seconds between requests.
-- Trying a VPN to change your IP address.
+- Trying a VPN or using a proxy as a parameter to scrape_property() to change your IP address.
 
 ---


@@ -31,7 +31,7 @@
    "metadata": {},
    "outputs": [],
    "source": [
-    "# scrapes all 3 sites by default\n",
+    "# check for sale properties\n",
     "scrape_property(\n",
     "  location=\"dallas\",\n",
     "  listing_type=\"for_sale\"\n",
@@ -53,7 +53,6 @@
     "# search a specific address\n",
     "scrape_property(\n",
     "  location=\"2530 Al Lipscomb Way\",\n",
-    "  site_name=\"zillow\",\n",
     "  listing_type=\"for_sale\"\n",
     ")"
    ]
@@ -68,7 +67,6 @@
     "# check rentals\n",
     "scrape_property(\n",
     "  location=\"chicago, illinois\",\n",
-    "  site_name=[\"redfin\", \"zillow\"],\n",
     "  listing_type=\"for_rent\"\n",
     ")"
    ]
@@ -88,7 +86,6 @@
     "# check sold properties\n",
     "scrape_property(\n",
     "  location=\"90210\",\n",
-    "  site_name=[\"redfin\"],\n",
     "  listing_type=\"sold\"\n",
     ")"
    ]


@@ -0,0 +1,18 @@
+from homeharvest import scrape_property
+from datetime import datetime
+
+# Generate filename based on current timestamp
+current_timestamp = datetime.now().strftime("%Y%m%d_%H%M%S")
+filename = f"output/{current_timestamp}.csv"
+
+properties = scrape_property(
+    location="San Diego, CA",
+    listing_type="sold",  # for_sale, for_rent
+    last_x_days=30,  # sold/listed in last 30 days
+    mls_only=True,  # only fetch MLS listings
+)
+print(f"Number of properties: {len(properties)}")
+
+# Export to csv
+properties.to_csv(filename, index=False)
+print(properties.head())


@@ -1,103 +1,41 @@
+import warnings
 import pandas as pd
-import concurrent.futures
-from concurrent.futures import ThreadPoolExecutor
 
 from .core.scrapers import ScraperInput
-from .utils import process_result, ordered_properties
+from .utils import process_result, ordered_properties, validate_input
 from .core.scrapers.realtor import RealtorScraper
-from .core.scrapers.models import ListingType, Property, SiteName
-from .exceptions import InvalidListingType
+from .core.scrapers.models import ListingType
+from .exceptions import InvalidListingType, NoResultsFound
 
-_scrapers = {
-    "realtor.com": RealtorScraper,
-}
-
-
-def _validate_input(listing_type: str) -> None:
-    if listing_type.upper() not in ListingType.__members__:
-        raise InvalidListingType(f"Provided listing type, '{listing_type}', does not exist.")
-
-
-def _scrape_single_site(location: str, site_name: str, listing_type: str, radius: float, proxy: str = None, sold_last_x_days: int = None) -> pd.DataFrame:
-    """
-    Helper function to scrape a single site.
-    """
-    _validate_input(listing_type)
-
-    scraper_input = ScraperInput(
-        location=location,
-        listing_type=ListingType[listing_type.upper()],
-        site_name=SiteName.get_by_value(site_name.lower()),
-        proxy=proxy,
-        radius=radius,
-        sold_last_x_days=sold_last_x_days
-    )
-
-    site = _scrapers[site_name.lower()](scraper_input)
-    results = site.search()
-    print(f"found {len(results)}")
-
-    properties_dfs = [process_result(result) for result in results]
-    if not properties_dfs:
-        return pd.DataFrame()
-
-    return pd.concat(properties_dfs, ignore_index=True, axis=0)[ordered_properties]
 
 
 def scrape_property(
     location: str,
     listing_type: str = "for_sale",
     radius: float = None,
-    sold_last_x_days: int = None,
+    mls_only: bool = False,
+    last_x_days: int = None,
     proxy: str = None,
 ) -> pd.DataFrame:
     """
     Scrape properties from Realtor.com based on a given location and listing type.
-
-    :param location: US Location (e.g. 'San Francisco, CA', 'Cook County, IL', '85281', '2530 Al Lipscomb Way')
-    :param listing_type: Listing type (e.g. 'for_sale', 'for_rent', 'sold'). Default is 'for_sale'.
-    :param radius: Radius in miles to find comparable properties on individual addresses. Optional.
-    :param sold_last_x_days: Number of past days to filter sold properties. Optional.
-    :param proxy: Proxy IP address to be used for scraping. Optional.
-    :returns: pd.DataFrame containing properties
     """
-    site_name = "realtor.com"
-
-    if site_name is None:
-        site_name = list(_scrapers.keys())
+    validate_input(listing_type)
+    scraper_input = ScraperInput(
+        location=location,
+        listing_type=ListingType[listing_type.upper()],
+        proxy=proxy,
+        radius=radius,
+        mls_only=mls_only,
+        last_x_days=last_x_days,
+    )
 
-    if not isinstance(site_name, list):
-        site_name = [site_name]
+    site = RealtorScraper(scraper_input)
+    results = site.search()
 
-    results = []
+    properties_dfs = [process_result(result) for result in results]
+    if not properties_dfs:
+        raise NoResultsFound("no results found for the query")
 
-    if len(site_name) == 1:
-        final_df = _scrape_single_site(location, site_name[0], listing_type, radius, proxy, sold_last_x_days)
-        results.append(final_df)
-    else:
-        with ThreadPoolExecutor() as executor:
-            futures = {
-                executor.submit(_scrape_single_site, location, s_name, listing_type, radius, proxy, sold_last_x_days): s_name
-                for s_name in site_name
-            }
-
-            for future in concurrent.futures.as_completed(futures):
-                result = future.result()
-                results.append(result)
-
-    results = [df for df in results if not df.empty and not df.isna().all().all()]
-
-    if not results:
-        return pd.DataFrame()
-
-    final_df = pd.concat(results, ignore_index=True)
-
-    columns_to_track = ["Street", "Unit", "Zip"]
-
-    #: validate they exist, otherwise create them
-    for col in columns_to_track:
-        if col not in final_df.columns:
-            final_df[col] = None
-
-    return final_df
+    with warnings.catch_warnings():
+        warnings.simplefilter("ignore", category=FutureWarning)
+        return pd.concat(properties_dfs, ignore_index=True, axis=0)[ordered_properties]
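The rewritten `scrape_property` wraps its final `pd.concat` in a `warnings.catch_warnings()` block to silence pandas' `FutureWarning`. A self-contained sketch of that pattern with toy DataFrames (the column names here are made up, not the scraper's real schema):

```python
import warnings
import pandas as pd

ordered_columns = ["price", "city"]
dfs = [
    pd.DataFrame({"city": ["Dallas"], "price": [350000]}),
    pd.DataFrame({"city": ["Austin"], "price": [425000]}),
]

# pd.concat can emit FutureWarning in newer pandas versions;
# the commit silences it only around the concat call itself.
with warnings.catch_warnings():
    warnings.simplefilter("ignore", category=FutureWarning)
    combined = pd.concat(dfs, ignore_index=True, axis=0)[ordered_columns]

print(len(combined))  # 2
```

Scoping the suppression to the `with` block keeps other `FutureWarning`s visible elsewhere in the program.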


@@ -5,7 +5,9 @@ from homeharvest import scrape_property
 def main():
     parser = argparse.ArgumentParser(description="Home Harvest Property Scraper")
-    parser.add_argument("location", type=str, help="Location to scrape (e.g., San Francisco, CA)")
+    parser.add_argument(
+        "location", type=str, help="Location to scrape (e.g., San Francisco, CA)"
+    )
 
     parser.add_argument(
         "-l",
@@ -33,21 +35,41 @@ def main():
         help="Name of the output file (without extension)",
     )
 
-    parser.add_argument("-p", "--proxy", type=str, default=None, help="Proxy to use for scraping")
-    parser.add_argument("-d", "--days", type=int, default=None, help="Sold in last _ days filter.")
+    parser.add_argument(
+        "-p", "--proxy", type=str, default=None, help="Proxy to use for scraping"
+    )
+    parser.add_argument(
+        "-d",
+        "--days",
+        type=int,
+        default=None,
+        help="Sold/listed in last _ days filter.",
+    )
 
     parser.add_argument(
         "-r",
-        "--sold-properties-radius",
-        dest="sold_properties_radius",  # This makes sure the parsed argument is stored as radius_for_comps in args
+        "--radius",
         type=float,
         default=None,
-        help="Get comparable properties within _ (eg. 0.0) miles. Only applicable for individual addresses."
+        help="Get comparable properties within _ (eg. 0.0) miles. Only applicable for individual addresses.",
+    )
+    parser.add_argument(
+        "-m",
+        "--mls_only",
+        action="store_true",
+        help="If set, fetches only MLS listings.",
     )
 
     args = parser.parse_args()
-    result = scrape_property(args.location, args.listing_type, radius_for_comps=args.radius_for_comps, proxy=args.proxy)
+    result = scrape_property(
+        args.location,
+        args.listing_type,
+        radius=args.radius,
+        proxy=args.proxy,
+        mls_only=args.mls_only,
+        last_x_days=args.days,
+    )
 
     if not args.filename:
         timestamp = datetime.datetime.now().strftime("%Y%m%d_%H%M%S")


@@ -8,14 +8,18 @@ from .models import Property, ListingType, SiteName
 class ScraperInput:
     location: str
     listing_type: ListingType
-    site_name: SiteName
     radius: float | None = None
+    mls_only: bool | None = None
     proxy: str | None = None
-    sold_last_x_days: int | None = None
+    last_x_days: int | None = None
 
 
 class Scraper:
-    def __init__(self, scraper_input: ScraperInput, session: requests.Session | tls_client.Session = None):
+    def __init__(
+        self,
+        scraper_input: ScraperInput,
+        session: requests.Session | tls_client.Session = None,
+    ):
         self.location = scraper_input.location
         self.listing_type = scraper_input.listing_type
@@ -30,9 +34,9 @@ class Scraper:
             self.session.proxies.update(proxies)
 
         self.listing_type = scraper_input.listing_type
-        self.site_name = scraper_input.site_name
         self.radius = scraper_input.radius
-        self.sold_last_x_days = scraper_input.sold_last_x_days
+        self.last_x_days = scraper_input.last_x_days
+        self.mls_only = scraper_input.mls_only
 
     def search(self) -> list[Property]:
         ...


@@ -106,12 +106,16 @@ class RealtorScraper(Scraper):
             Property(
                 mls_id=property_id,
                 property_url=f"{self.PROPERTY_URL}{property_info['details']['permalink']}",
-                address=self._parse_address(property_info, search_type="handle_address"),
-                description=self._parse_description(property_info)
+                address=self._parse_address(
+                    property_info, search_type="handle_address"
+                ),
+                description=self._parse_description(property_info),
             )
         ]
 
-    def general_search(self, variables: dict, search_type: str) -> Dict[str, Union[int, list[Property]]]:
+    def general_search(
+        self, variables: dict, search_type: str
+    ) -> Dict[str, Union[int, list[Property]]]:
         """
         Handles a location area & returns a list of properties
         """
@@ -169,17 +173,23 @@ class RealtorScraper(Scraper):
             }
         }"""
 
-        sold_date_param = ('sold_date: { min: "$today-%sD" }' % self.sold_last_x_days
-                           if self.listing_type == ListingType.SOLD and self.sold_last_x_days
-                           else "")
-        sort_param = ('sort: [{ field: sold_date, direction: desc }]'
-                      if self.listing_type == ListingType.SOLD
-                      else 'sort: [{ field: list_date, direction: desc }]')
+        date_param = (
+            'sold_date: { min: "$today-%sD" }' % self.last_x_days
+            if self.listing_type == ListingType.SOLD and self.last_x_days
+            else (
+                'list_date: { min: "$today-%sD" }' % self.last_x_days
+                if self.last_x_days
+                else ""
+            )
+        )
+        sort_param = (
+            "sort: [{ field: sold_date, direction: desc }]"
+            if self.listing_type == ListingType.SOLD
+            else "sort: [{ field: list_date, direction: desc }]"
+        )
 
         if search_type == "comps":
-            print('general - comps')
-            query = (
-                """query Property_search(
+            query = """query Property_search(
                     $coordinates: [Float]!
                     $radius: String!
                     $offset: Int!,
@@ -198,15 +208,12 @@ class RealtorScraper(Scraper):
                         offset: $offset
                     ) %s""" % (
                 self.listing_type.value.lower(),
-                sold_date_param,
+                date_param,
                 sort_param,
-                results_query
-            )
+                results_query,
             )
         else:
-            print('general - not comps')
-            query = (
-                """query Home_search(
+            query = """query Home_search(
                     $city: String,
                     $county: [String],
                     $state_code: String,
@@ -225,13 +232,11 @@ class RealtorScraper(Scraper):
                         %s
                         limit: 200
                         offset: $offset
-                    ) %s"""
-                % (
+                    ) %s""" % (
                 self.listing_type.value.lower(),
-                sold_date_param,
+                date_param,
                 sort_param,
-                results_query
-            )
+                results_query,
             )
 
         payload = {
@@ -264,32 +269,44 @@ class RealtorScraper(Scraper):
                 else None
             )
 
-            if not mls:
+            if not mls and self.mls_only:
                 continue
 
-            able_to_get_lat_long = result and result.get("location") and result["location"].get("address") and result["location"]["address"].get("coordinate")
+            able_to_get_lat_long = (
+                result
+                and result.get("location")
+                and result["location"].get("address")
+                and result["location"]["address"].get("coordinate")
+            )
 
             realty_property = Property(
                 mls=mls,
-                mls_id=result["source"].get("listing_id") if "source" in result and isinstance(result["source"], dict) else None,
+                mls_id=result["source"].get("listing_id")
+                if "source" in result and isinstance(result["source"], dict)
+                else None,
                 property_url=f"{self.PROPERTY_URL}{result['property_id']}",
                 status=result["status"].upper(),
                 list_price=result["list_price"],
-                list_date=result["list_date"].split("T")[0] if result.get("list_date") else None,
+                list_date=result["list_date"].split("T")[0]
+                if result.get("list_date")
+                else None,
                 prc_sqft=result.get("price_per_sqft"),
                 last_sold_date=result.get("last_sold_date"),
-                hoa_fee=result["hoa"]["fee"] if result.get("hoa") and isinstance(result["hoa"], dict) else None,
-                latitude=result["location"]["address"]["coordinate"].get("lat") if able_to_get_lat_long else None,
-                longitude=result["location"]["address"]["coordinate"].get("lon") if able_to_get_lat_long else None,
+                hoa_fee=result["hoa"]["fee"]
+                if result.get("hoa") and isinstance(result["hoa"], dict)
+                else None,
+                latitude=result["location"]["address"]["coordinate"].get("lat")
+                if able_to_get_lat_long
+                else None,
+                longitude=result["location"]["address"]["coordinate"].get("lon")
+                if able_to_get_lat_long
+                else None,
                 address=self._parse_address(result, search_type="general_search"),
                 neighborhoods=self._parse_neighborhoods(result),
-                description=self._parse_description(result)
+                description=self._parse_description(result),
             )
             properties.append(realty_property)
 
-        # print(response_json["data"]["property_search"], variables["offset"])
-        # print(response_json["data"]["home_search"]["total"], variables["offset"])
-
         return {
             "total": response_json["data"][search_key]["total"],
             "properties": properties,
@@ -304,7 +321,6 @@ class RealtorScraper(Scraper):
         }
 
         search_type = "comps" if self.radius and location_type == "address" else "area"
-        print(search_type)
         if location_type == "address":
             if not self.radius:  #: single address search, non comps
                 property_id = location_info["mpr_id"]
@@ -370,10 +386,10 @@ class RealtorScraper(Scraper):
         )
         return Address(
             street=f"{result['address']['street_number']} {result['address']['street_name']} {result['address']['street_suffix']}",
-            unit=result['address']['unit'],
-            city=result['address']['city'],
-            state=result['address']['state_code'],
-            zip=result['address']['postal_code'],
+            unit=result["address"]["unit"],
+            city=result["address"]["city"],
+            state=result["address"]["state_code"],
+            zip=result["address"]["postal_code"],
        )
 
     @staticmethod


@@ -1,4 +1,4 @@
-from .core.scrapers.models import Property
+from .core.scrapers.models import Property, ListingType
 import pandas as pd
 
 ordered_properties = [
@@ -74,3 +74,10 @@ def process_result(result: Property) -> pd.DataFrame:
     properties_df = properties_df.reindex(columns=ordered_properties)
 
     return properties_df[ordered_properties]
+
+
+def validate_input(listing_type: str) -> None:
+    if listing_type.upper() not in ListingType.__members__:
+        raise InvalidListingType(
+            f"Provided listing type, '{listing_type}', does not exist."
+        )
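The relocated `validate_input` helper checks the user's string against the enum's member names via `Enum.__members__`. The same pattern in isolation, with a toy enum and exception standing in for the library's:

```python
from enum import Enum

class ListingType(Enum):  # stand-in for homeharvest's enum
    FOR_SALE = "for_sale"
    FOR_RENT = "for_rent"
    SOLD = "sold"

class InvalidListingType(Exception):  # stand-in for the library's exception
    pass

def validate_input(listing_type: str) -> None:
    # Enum.__members__ maps member NAMES (upper-case) to members,
    # so the input is upper-cased before the membership check
    if listing_type.upper() not in ListingType.__members__:
        raise InvalidListingType(
            f"Provided listing type, '{listing_type}', does not exist."
        )

validate_input("for_sale")  # passes silently
try:
    validate_input("for_lease")
except InvalidListingType as e:
    print(e)  # Provided listing type, 'for_lease', does not exist.
```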


@@ -27,7 +27,9 @@ def test_realtor_last_x_days_sold():
         location="Dallas, TX", listing_type="sold", sold_last_x_days=10
     )
 
-    assert all([result is not None for result in [days_result_30, days_result_10]]) and len(days_result_30) != len(days_result_10)
+    assert all(
+        [result is not None for result in [days_result_30, days_result_10]]
+    ) and len(days_result_30) != len(days_result_10)
 
 
 def test_realtor_single_property():
@@ -39,7 +41,7 @@ def test_realtor_single_property():
         scrape_property(
             location="2530 Al Lipscomb Way",
             listing_type="for_sale",
-        )
+        ),
     ]
 
     assert all([result is not None for result in results])