RealPage on finding value through data quality

A combination of people, process and technology is key to ensuring high-quality data leads to optimized real estate investment decisions, argues RealPage’s Mark Trocolar.

This article is sponsored by RealPage

There is data available on almost anything these days, but reliability is critical. To leverage data insights, real estate managers and investors need not only to collect and track important data, but also to ensure data quality and on-demand availability.

Both of these processes are crucial to making investment decisions that provide not only the most benefit to managers but also the best return to their investors, says Mark Trocolar, vice-president at RealPage Investment Management, a global provider of software and data analytics to the real estate industry.

What is the right data for real estate managers to track and rely on to make optimal investment decisions?


Whether you are an asset manager, portfolio manager or investor, you should capture and track data over time that gives you a clear picture of a real estate investment, allows you to manage its associated risk, and enables you to develop insights and drive outperformance going forward.

To have a full picture, you should capture investment attributes as well as financial and operational data. Regarding investment attributes, data capture efforts should focus on data points that allow you to properly classify the investment and manage risk and exposure, for instance: location, size, class or lifecycle, as well as asset type-specific data points such as clear height and column spacing for industrial properties.

Regarding financial data, you should capture actuals, budgets and forecasts, so that you understand where the asset has been, where it is now and where it will be in the near to mid-term. Converting financials to a single chart of accounts, isolating cashflows at multiple investment levels, having transparency into specific transaction types and being able to produce key returns are important considerations. This will enable you to perform a tremendous amount of analysis and provide the basis for scenario modeling to forecast future performance.
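To make the idea of a single chart of accounts concrete, here is a minimal sketch in Python. The account codes, system names and fields are hypothetical assumptions for illustration, not RealPage's actual mapping; the point is simply that source-system accounts are re-coded onto one standard chart so cashflows can be isolated and compared at the asset level.

```python
# Minimal sketch (account codes, system names and fields are assumed):
# re-code each source system's general-ledger accounts onto one standard
# chart of accounts, then isolate cashflows for a single asset and period.

# Map of (source system, source account code) -> standard account code
ACCOUNT_MAP = {
    ("pm_system_a", "4010"): "REV_BASE_RENT",
    ("pm_system_a", "6120"): "EXP_REPAIRS_MAINT",
    ("pm_system_b", "40-100"): "REV_BASE_RENT",
    ("pm_system_b", "61-200"): "EXP_REPAIRS_MAINT",
}

def to_standard_chart(transactions):
    """Re-code raw transactions onto the standard chart of accounts."""
    mapped, unmapped = [], []
    for txn in transactions:
        std = ACCOUNT_MAP.get((txn["source_system"], txn["source_account"]))
        if std is None:
            unmapped.append(txn)  # surface mapping gaps rather than guessing
        else:
            mapped.append({**txn, "standard_account": std})
    return mapped, unmapped

# Example: isolate the net cashflow of one asset for one period
raw = [
    {"source_system": "pm_system_a", "source_account": "4010",
     "asset_id": "asset_001", "period": "2023-06", "amount": 125000.0},
    {"source_system": "pm_system_b", "source_account": "61-200",
     "asset_id": "asset_001", "period": "2023-06", "amount": -18000.0},
]
mapped, unmapped = to_standard_chart(raw)
net_cashflow = sum(t["amount"] for t in mapped if t["asset_id"] == "asset_001")
print(f"asset_001 net cashflow: {net_cashflow:,.0f} ({len(unmapped)} unmapped)")
```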

Last, but certainly not least, is operational data, which is the context behind and the basis of the financial data. Actively curating operational data allows managers to incorporate operational strategy and planning into their analyses and forecasting, enabling them to use it to drive the outperformance of the investment.

Data on tenants, leases, receivables, key dates and operating metrics is extremely important to enable managers and investors to understand investment health and drivers of performance, manage exposure-related risk and develop operational planning to proactively address it.

How can real estate asset managers make sure they can trust the data?

Trust in your data is the key to enabling data-driven decision making. If your organization does not trust the data it has available, it will not make decisions based upon the analyses performed using that data.

Eventually, this will lead to the proliferation of personal spreadsheets scattered across the organization, each containing versions of the same or similar data. There will be no consistency, and it will be very difficult to manage risk or make optimal investment decisions.

When some organizations decide they want to capture and utilize investment data, they will take data from their property managers, joint venture partners, development companies and appraisers, aggregate it and immediately try to use it in reports and dashboards.

Unfortunately, the reality of working with data is that it is not of a high quality by default, nor does it consistently agree when provided by disparate sources. Data needs to be validated and aggregated in a thoughtful way that provides a solid, common foundation for usage, encourages easy access and affords transparency.

What kind of access and flexibility around data reporting and analytics do investors need?

Allowing your managers and investors a variety of ways to consume data is very important. A data consumer’s perspective or role in the organization will help determine their preferred method.

Some consumers will want all their important data and metrics organized and presented on a single summarized dashboard. As they use that dashboard to monitor the latest information, they will want to drill down on metrics to uncover the data impacting their values or determine why data is trending in one direction or another. Others may want to access data via their mobile or even have it sent to them, so they aren’t required to log into an application.

Additionally, some consumers may want very detailed data or the ability to generate ad-hoc queries to answer specific questions while others may want to extract large amounts of raw data to perform modeling and data analysis tasks. Because of these varying needs, flexibility to view, drill down, roll up and extract data through a variety of methods is very important.

The data made available for analysis and reporting across the organization must be accurate and reliable. To achieve this, organizations must implement a repeatable, scalable and efficient best practice approach to data quality that combines people, process and technology. All three of these pillars working together can effectively ensure success and high levels of data quality.

At a high level, each organization needs to implement a data management and governance framework. Many organizations are reluctant to do this because it sounds difficult, time-consuming and expensive, but it doesn’t have to be. The most important aspect of a framework is an organized approach to data quality across the organization.

The application of technology is the central focus of a scalable and efficient data quality framework. When utilized appropriately, technology ensures data correctness and availability, facilitates communication and automates manual processes.

Technology cannot solve all your data quality problems, but employing systematic data validations is the key to scalability and consistency. Data should be systematically analyzed before being made available for key reporting and analyses to ensure it meets organizational data quality standards. This can be executed for all the data an organization should be tracking – investment attributes, financial and operational data.

The application of systematic rule sets can identify suspect data, data changes and values exceeding thresholds so that a knowledgeable person can determine its correctness. This is critical to ensure scalability and focus manager attention where required.
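A minimal sketch of what such a rule set might look like in Python follows. The field names and the 25 percent change threshold are illustrative assumptions, not RealPage's rules; the intent is to show suspect values being flagged for a knowledgeable reviewer rather than silently corrected.

```python
# Minimal sketch (field names and thresholds are assumed): rule-based checks
# that flag missing values, implausible ranges and large period-over-period
# changes for human review before the data is published for reporting.

def validate_record(current, prior=None):
    """Return a list of data quality exceptions for one asset-period record."""
    exceptions = []

    # Rule 1: required fields must be present
    for field in ("asset_id", "period", "occupancy_pct", "noi"):
        if current.get(field) is None:
            exceptions.append(f"missing value for '{field}'")

    # Rule 2: values must fall within plausible ranges
    occ = current.get("occupancy_pct")
    if occ is not None and not 0.0 <= occ <= 100.0:
        exceptions.append(f"occupancy {occ} is outside 0-100%")

    # Rule 3: flag large swings versus the prior period for review
    if (prior is not None and current.get("noi") is not None
            and prior.get("noi") not in (None, 0)):
        change = abs(current["noi"] - prior["noi"]) / abs(prior["noi"])
        if change > 0.25:  # assumed 25% review threshold
            exceptions.append(f"NOI changed {change:.0%} versus the prior period")

    return exceptions

prior = {"asset_id": "asset_001", "period": "2023-05",
         "occupancy_pct": 93.0, "noi": 410000.0}
current = {"asset_id": "asset_001", "period": "2023-06",
           "occupancy_pct": 94.5, "noi": 610000.0}
for issue in validate_record(current, prior):
    print("REVIEW:", issue)  # routed to a knowledgeable person, not auto-corrected
```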

Process is the glue that joins technology and people. Throughout the data quality framework, standard and repeatable processes enable consistency and efficiency and bring technology and people together to achieve goals. For example, automating manual workflows is a terrific way to marry people, technology and process to achieve consistency, efficiency, transparency and auditability.

Once investors know they can trust the data, how do they make sure they have access to it, when and how they need it?

Once you ensure the quality of your data is high, it should be available for reporting. The aggregated data set should promote transparency and availability. A best practice approach dictates it should be aggregated in a single location where it can be compared, consolidated, analyzed and aligned with additional data sets to generate insights so that data can become information.

An aggregated data set is important as it allows consistency so consumers asking the same question obtain the same answer. By driving consumers to the aggregated and validated data set, you are ensuring analyses start from vetted, foundational data.

The aggregated data set must also be available to consumers as quickly as practicable. Data consumers want the ability to quickly access data in ways that make it easy for them to use. Additionally, a variety of ways to access and consume the data is preferred – such as dashboards, standardized reports, ad-hoc queries, Excel add-ins, email or auto-notifications.

What kinds of decisions can be made with the right data available on demand?

When an organization has relevant, high-quality data available on-demand to all constituents, that data can drive a wide variety of impactful decisions. A thorough understanding of an asset, including leases, vacancy and market/demographic data, can identify assets ripe for innovative asset management strategies.

A good example is ground floor retail repurposing by employing window billboards in vacant spaces to realize alternative cashflow streams. Risk management or mitigation strategies can be driven by identifying tenant, industry, geographic and lease rollover risks.

The ability to compare assets not only across your portfolio, but also to benchmarks, comparative investments and market data will provide valuable insights into asset performance. By incorporating various types of data into your analyses, such as investment attributes, financial and market data, both positive and negative correlations and drivers of performance can be established.

Additionally, with the financial data we spoke about earlier and the right tool set, scenario-based forecasting at an asset and portfolio level is possible. Organizations can perform sensitivity analysis, running best-case, worst-case and most-likely scenarios.
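As a simple illustration of that kind of sensitivity analysis, the sketch below projects an asset's exit value under three scenarios. The base NOI, growth rates and cap rates are assumed figures chosen for the example, not RealPage data or methodology.

```python
# Minimal sketch (all figures are assumed): best / worst / most-likely
# scenario forecasts of an asset's exit value from a base NOI.

BASE_NOI = 1_000_000.0  # current-year net operating income (assumed)

SCENARIOS = {
    "best":        {"noi_growth": 0.04, "exit_cap_rate": 0.050},
    "most_likely": {"noi_growth": 0.02, "exit_cap_rate": 0.055},
    "worst":       {"noi_growth": -0.01, "exit_cap_rate": 0.065},
}

def project_value(base_noi, noi_growth, exit_cap_rate, years=5):
    """Grow NOI for `years`, then capitalize terminal NOI at the exit cap rate."""
    terminal_noi = base_noi * (1 + noi_growth) ** years
    return terminal_noi / exit_cap_rate

for name, params in SCENARIOS.items():
    value = project_value(BASE_NOI, **params)
    print(f"{name:>12}: projected exit value ${value:,.0f}")
```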

Understanding how resilient your assets and portfolio will be to specific market conditions will uncover the assets to be held, sold and acquired in the future. By turning data into information, organizations are equipped to make optimal investment decisions.

What could go wrong if you don’t have data you can rely on?

If an organization does not have readily available, high-quality data, both managers and investors will look for it elsewhere. As they acquire data, they will capture it in their own personal data stores. Immediately, consistency will be lost, data will become siloed, and aggregation across the organization will be very difficult.

Pockets of people across the organization will capture and track data they believe is important and then make decisions based on that fragmented data – as opposed to taking a holistic approach with a single source of truth the whole organization can access for its analyses, so that everyone is using a solid, vetted data set to make decisions.

A worst-case scenario is someone relying on data that is incomplete, inconsistent or flawed to make a decision which will impact current and future returns.

Leases with suboptimal terms will be signed, impacting asset valuations. Properties may be sold too early and returns not maximized. Underwriting analyses will be inconsistent, leading to poor acquisitions with suboptimal returns or increased risk. Your decision-making is only as good as the data that you base it upon.