This article is sponsored by Alter Domus
The data requirements of real estate investment managers are growing in both scale and complexity, so more and more managers are looking for an outsourced solution that allows them to focus on core competencies and gain operational efficiencies.
PERE listens in as Michael Gregori and Alex Droste of Alter Domus talk to Peeyush Shukla, chief information and technology officer at Heitman, about his company’s move to outsourced data services and the benefits of this approach.
Alex Droste: As you were drawing up your new data platform, how did you think about Heitman’s need for advanced data concepts knowing that you also had to account for working with a third party administrator on a lot of the data that you need? Were there any special considerations taken to combine the data-as-a-service (DaaS) approach with the build of your platform?
Peeyush Shukla: Before I joined Heitman, there was a pre-existing relationship with Alter Domus for accounting services and I look at this as a synergistic extension of that relationship. Data is extremely strategic. With the evolution of data and data science tools in the industry, it has become increasingly important, and indeed critical, for firms to tap into the power of data.
As part of this DaaS, both of our teams have put a lot of effort into the creation of a data dictionary, the right kind of standardization and automation of the data feeds, and a secure way of creating and transferring data.
All of those things go a long way in terms of us having the same consistent understanding of what kind of accounting data is really important to drive Heitman’s business. Emerging from this will be many other advanced uses of data, but it is really important to have a strong foundation.
AD: Did you have any concerns about the systems not being ones you govern yourself, or concerns about possible incompatibility? One of our major endeavors is to make sure that the data processing during the accounting and operations process is structured and locked down, ensuring that we do not go out of bounds with any processing of the data. This can sometimes limit data flexibility.
PS: I suppose there is a natural inhibition firms have when it comes to outsourcing any aspect of data or services, hence the need for a very clear understanding of the processes and taxonomy involved. It is really important to have the right kind of quality checks upstream, because catching data quality issues downstream is more costly; the further downstream you are, the more expensive it is to replay the data back.
One of the advantages of the DaaS approach is that it is agnostic of the underlying system. I also stress the expandability aspect of it.
We have started out on this data journey with a focus on domestic accounting data sets, but as we look to evolve this, the number of systems supporting the accounting process will not be a hindrance or a limiting factor. We could have a very fluid evolution path for our data expansion.
Michael Gregori: I think you are likely ahead of the curve compared with the wider industry and its thinking on where it wants to be from a data perspective.
I would add that a big part of success in these endeavors is just leaning on the relationship as well, keeping communication lines open, and really getting to that data quality, as you mentioned.
Expectations out of the system
MG: It’s been great to further expand our relationship and continue to support your business growth through this project. We’ve really been able to benefit from the team bonds and connections in order to get this to work efficiently and reach your expectations. So, further to that, what is it that you expect to get out of the system?
PS: Prior to my starting, we were leaning on what I would call on-premises hardware, so we had an opportunity to pivot to a more scalable cloud platform. The vision I have is for our systems to be connected in a very straight-through kind of way.
So, data will actually be coming out of the Alter Domus DaaS model and be brought into our global data and analytics platform here at Heitman, first landing in a data lake, which is one of the most robust ways of managing data. A data lake is really a mechanism to ingest a diverse array of data: structured, unstructured and semi-structured.
However, the lake is very agnostic of what kind of data it is. Real estate is extremely data rich, with different formats and different ways of getting it.
The goal is to start to bring in all of these different types of data into our data lake here at Heitman and that is where all of the transformation would occur. Aggregation and enrichment happens on top of what is brought in. Then via our cloud platform we have the ability to provide for different data consumption needs.
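The landing-then-transform flow described here can be sketched in a few lines. This is purely an illustrative sketch, not Heitman's or Alter Domus's actual pipeline: all names, field names and the in-memory "lake" are assumptions, standing in for a cloud data lake and its downstream transformation layer.

```python
# Hypothetical sketch: raw records of any shape land first (the lake is
# agnostic of their form); transformation and aggregation happen afterward,
# on top of what was ingested. All identifiers here are illustrative.
import json


def ingest_to_lake(lake, source_name, records):
    """Land raw records as-is, grouped by source feed."""
    lake.setdefault(source_name, []).extend(records)


def transform_and_aggregate(lake):
    """Downstream step: aggregate only records with a recognized shape."""
    totals = {}
    for source, records in lake.items():
        for rec in records:
            if "property_id" in rec and "noi" in rec:
                totals[rec["property_id"]] = (
                    totals.get(rec["property_id"], 0.0) + float(rec["noi"])
                )
    return totals


lake = {}
ingest_to_lake(lake, "accounting_feed", [
    {"property_id": "A1", "noi": 120.0},
    {"property_id": "A1", "noi": 80.0},
    {"note": "semi-structured record: kept in the lake, skipped here"},
])
print(json.dumps(transform_and_aggregate(lake)))
```

The point of the sketch is the ordering: ingestion accepts everything, and schema-aware work happens only in the transformation step, mirroring the "transformation occurs in the lake" approach described above.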
Striking a balance between standardizing and tailoring data
AD: When you were going through the definitions of the data that you are aiming to provide to your end consumers, how did you evaluate where that line is between what you would want to define as the Heitman standard of delivery and trying to configure further ways of providing that data?
PS: The use cases for that were really drawn from our asset and portfolio management teams, our research team, and our client service and client reporting department. That allowed us to funnel the right kind of needs, and we took that as a starting point.
This led us to the right kind of spotlighting, the right footprint that was required out of the accounting platform. After all, the platform is meant to support our clients and our internal stakeholders. It took us a lot of effort up front, but it was time very well spent.
Focusing on the ‘right’ automation
MG: As a firm, we have invested a lot in automation and robotics with a dedicated, talented team focused on just that, and it is going to factor into a number of our processes, especially in the DaaS model.
Removing human intervention, reducing error and increasing the speed of deliveries all come out of that investment. It has delivered quite a boost to the quality of service.
What is your experience, and what are those machine learning or artificial intelligence topics that interest you at Heitman? What are the things that you are looking forward to which would support your business in the near future?
PS: It all boils down to good data, good processes and then the right kind of automation. You cannot have automation on top of bad data.
We have been investing to advance our data science capabilities through machine learning, artificial intelligence and robotic process automation (RPA). We have identified a number of processes that are laborious and tedious, but also predictable and repetitive. So, those are key hallmarks of what is an ‘RPA-worthy’ kind of use case, or initiative.
From a machine learning and AI perspective we have plenty of options through our cloud platform to leverage for our portfolio construction and research teams, with the goal of outperforming our benchmarks and creating alpha, in terms of our investment management process.
When it comes to projecting performance, cashflows and distribution of outcomes for our investments, all of those have a very strong connectivity to some of the core capabilities that machine learning offers. That is why it is so important for us to have the right data platform, because it would serve many different purposes outside of traditional reporting. We want a virtuous cycle of data creation, data learning and data manifestation.
The importance of synergy
AD: Anybody who is trying to explore this part of the technology world is going to probably be on the same journey that we are, but we believe the most important thing is identifying flexible but standardized ways of collecting the data.
If we are flexible on what data points we bring in, but we are diligent about normalizing those data points, once they get onto our platform they are standardized and robust internally.
As long as you have the data sorted out appropriately, you have a more flexible way of delivering that data. Our main goal is to continue working on identifying ways to bring the data onto our platform in the most efficient way possible.
This may mean looking for trends in the data coming in and trying to find out if we can use machine learning to aid us in coding that data, as it comes onto the platform. However, the key is making sure that the base data is strong.
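The "flexible intake, standardized internally" idea can be illustrated with a small normalization step. This is a minimal sketch under assumed field names and aliases; it is not the actual schema or coding logic used by either firm.

```python
# Illustrative only: differently-named incoming fields are mapped onto one
# internal standard as records land on the platform. The alias table is an
# assumption, standing in for a real data dictionary.
FIELD_ALIASES = {
    "net_operating_income": "noi",
    "NOI": "noi",
    "asset_id": "property_id",
    "propertyId": "property_id",
}


def normalize(record):
    """Map known aliases onto standard field names; pass others through."""
    return {FIELD_ALIASES.get(key, key): value for key, value in record.items()}


raw = {"asset_id": "A1", "NOI": 200.0, "region": "US"}
print(normalize(raw))  # {'property_id': 'A1', 'noi': 200.0, 'region': 'US'}
```

Being diligent about this mapping at intake is what makes the data "standardized and robust internally" even though the sources remain flexible; a learned model could later propose new alias entries, but the base mapping stays explicit.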
PS: Yes, the fundamentals have to be strong, because any derivatives of those fundamentals are directly correlated to what you start out with. I would add that the X factor in all of this is the synergies between what we do and what Alter Domus does.
I am a big fan of this mathematical equation: one plus one plus synergy equals 111. It is the synergy and the shared strategic view which gives us the operational alpha.