The following is an excerpt from a white paper published by ATTOM Data Solutions. The full 10-page white paper with exclusive data and charts from ATTOM, along with more insights from thought leaders in predictive analytics, is available for free download here.
Real estate used to be a game of hunches. People bought and sold property because they had a sense of pricing, timing, and marketplace trends. Mortgages were made in large measure on the basis of past performance.
Today hunches are out and big data is in. An artificial intelligence revolution is sweeping the real estate world, as predictive analytics promises better leads and early access to future inventory, translating into lower costs, less risk, and bigger profits for the industry.
The drive away from housing market hunches toward sophisticated predictive analytics built on big data principles is led by a growing group of housing "precogs": relative newcomers to the industry with strong ties to Silicon Valley, funded largely by venture capital.
“We use predictive analytics and machine learning to analyze how likely a homeowner is to sell in the near future,” said Avi Gupta, President and CEO at SmartZip Analytics. “These techniques look at historical data — who has sold in the past — to identify, from several thousand data attributes, which ones may have been a factor in triggering those sales. And then, they look for owners that exhibit similar triggers to predict who is more likely to sell in the future.”
Gupta added that “real estate is truly hyper-local, in that, the triggers that matter in a given neighborhood block can be different from the one next door, or even across the street. And these triggers can change from time to time even for the same neighborhood block. Hence, we have had to build hundreds of predictive models that look for various combinations of triggers to find the one that is the most accurate for each neighborhood across the country.”
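The trigger-based approach Gupta describes can be sketched in miniature: learn from historical records which attributes were more common among owners who sold, then score current owners on the triggers they exhibit. This is an illustrative toy only; the attribute names, data, and scoring rule below are hypothetical stand-ins, not SmartZip's actual models, which combine thousands of attributes per neighborhood.

```python
# Toy sketch of trigger-based sale prediction. All attributes and records
# are hypothetical; a production system would use thousands of attributes
# and separate models per neighborhood.

# Historical records: each owner's attributes, and whether they sold.
history = [
    ({"years_owned_10plus": True,  "recent_refinance": False, "empty_nester": True},  True),
    ({"years_owned_10plus": True,  "recent_refinance": True,  "empty_nester": False}, False),
    ({"years_owned_10plus": False, "recent_refinance": False, "empty_nester": False}, False),
    ({"years_owned_10plus": True,  "recent_refinance": False, "empty_nester": True},  True),
]

def trigger_weights(records):
    """Estimate how strongly each attribute is associated with a sale:
    P(attribute | sold) minus P(attribute | did not sell)."""
    sold = [attrs for attrs, did_sell in records if did_sell]
    kept = [attrs for attrs, did_sell in records if not did_sell]
    weights = {}
    for attr in records[0][0]:
        p_sold = sum(a[attr] for a in sold) / len(sold)
        p_kept = sum(a[attr] for a in kept) / len(kept)
        weights[attr] = p_sold - p_kept
    return weights

def likelihood_score(attrs, weights):
    """Score a current owner by summing the weights of triggers they exhibit."""
    return sum(w for attr, w in weights.items() if attrs[attr])

weights = trigger_weights(history)
prospect = {"years_owned_10plus": True, "recent_refinance": False, "empty_nester": True}
print(round(likelihood_score(prospect, weights), 2))  # higher score = more likely to sell
```

Because triggers differ block by block, a real system would fit a separate set of weights (or a separate model entirely) for each neighborhood, exactly as Gupta describes.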
Personal Data Dossiers
Back in 1971 — when many MLS brokers carried printed 3×5 cards to show inventory — the playwright Arthur Miller wrote that “too many information handlers seem to measure a man by the number of bits of storage capacity his dossier will occupy.”
Now such dossiers are far larger: vast electronic collections that catalog our preferences in excruciating detail. Not just a tidbit here and there, but encyclopedic volumes of data ceaselessly gathered through clicks, links, cookies, tracking pixels, surveys, cell phone locators, loyalty programs, credit card purchases, and other collection techniques.
Companies, governments, and data brokers are accumulating unheard-of volumes of data. Forget about gigabytes, petabytes, and exabytes. We’ve hit zettabytes — a measure equal to one trillion gigabytes.
“By 2025 the global datasphere will grow to 163 zettabytes,” says IDC. “That’s ten times the 16.1ZB of data generated in 2016. All this data will unlock unique user experiences and a new world of business opportunities.”
While data by itself has some innate value, it becomes exponentially more valuable for predictive analytics when sorted and analyzed with artificial intelligence.
“Generally speaking,” explains Alex Villacorta, EVP and chief economist at HouseCanary, “the growth of data across every part of the economy and our personal lives has provided us predictive modelers the ability to better understand how various pieces of a person’s life affect their decision-making. Everything is now on the table, from our social activity to current headline news to the types of products we buy online.”
“For a growing number of industries,” says McKinsey & Company, “AI is tilting the playing field – you’ll need to understand how before your competitors do.”
Proper Care & Feeding of AI
Data is just part of the equation — and a relatively small part at that — when it comes to applying AI principles to predicting future real estate transactions, according to Brad McDaniel, co-founder and CEO of Likely.AI, a company that provides AI-driven leads to the real estate and mortgage industries.
“With the most advanced version of AI, called deep learning, which is what we use, only 10 percent of the final prediction decision is determined by the data itself,” he said. “That is because 90 percent of the predictive power comes from the extremely complicated interactions between the layers of neurons within the deep neural networks that we have created. We now live in a time where data availability is everywhere, but what you do with it is where the magic happens.”
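The “layers of neurons” McDaniel refers to can be illustrated with a minimal feed-forward network: each layer combines the outputs of the previous one, so the final prediction emerges from interactions between layers rather than from the raw inputs alone. This is a bare-bones sketch with made-up weights, not Likely.AI’s model; real deep networks have many more layers and learn their weights from data.

```python
# Minimal feed-forward network sketch. The weights below are hypothetical
# placeholders; a trained deep network would learn them from historical data.

import math

def relu(x):
    # Standard hidden-layer activation: pass positives, zero out negatives.
    return max(0.0, x)

def sigmoid(x):
    # Squashes the final score into a 0-1 likelihood.
    return 1.0 / (1.0 + math.exp(-x))

def layer(inputs, weights, activation):
    """One dense layer: each neuron takes a weighted sum of all inputs."""
    return [activation(sum(w * i for w, i in zip(neuron, inputs)))
            for neuron in weights]

def predict(features, hidden1, hidden2, output):
    """Forward pass through two hidden layers to a single sigmoid output."""
    h1 = layer(features, hidden1, relu)
    h2 = layer(h1, hidden2, relu)
    return layer(h2, output, sigmoid)[0]

# Three input attributes, two hidden layers of two neurons, one output neuron.
hidden1 = [[0.8, -0.4, 0.3], [0.1, 0.9, -0.2]]
hidden2 = [[0.6, 0.5], [-0.3, 0.7]]
output  = [[1.2, 0.4]]

score = predict([1.0, 0.0, 1.0], hidden1, hidden2, output)
print(0.0 < score < 1.0)  # a sigmoid output is always a 0-1 likelihood
```

Note how the inputs never reach the output directly: every prediction is mediated by the intermediate layers, which is the sense in which most of the predictive power lives in the network’s internal interactions rather than in the raw data.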