In Construction Dive's ConTech Conversations monthly column, we talk with industry leaders to learn about their companies’ tech adoption and innovation and what the future holds. Do you know a contech expert we should interview? Email [email protected].
Jit Kee Chin joined Suffolk Construction almost three years ago after nearly a decade advising on data analytics at McKinsey & Co. Now Suffolk's executive vice president and chief data and innovation officer, Chin is using her experience to find the best ways to leverage the contractor's data and to implement easy-to-use systems and practices that improve the flow and use of information.
Editor’s note: The following interview has been edited for brevity and clarity.
CONSTRUCTION DIVE: You have a background and experience with data collection. What are the best practices for collecting all of the information coming from a construction jobsite? Are contractors doing that properly?
JIT KEE CHIN: It is a common perception that companies collect a lot of information but don't use it well. I'd say that's true up to a point. I think we can make better use of the knowledge we collect, but construction actually doesn't have large volumes of information. What it does have is very heterogeneous information. It has a lot of information across contracts, across text documents, across drawings and across financial information.
So, the challenge in construction data is heterogeneity in terms of the data that we historically collect.
The other challenge is how clean that data is. That requires good data management practices, which historically I don't think we've had.
The second trend that I see is more of what I would call big data capture applications. When [contractors use] image capture, video capture, sensors, internet of things-type devices and wearable devices on and off the site, there's a new stream of information coming in which is typically handled through point solutions at the moment and not in any way integrated with a company's [enterprise resource planning].
I think the best practice is, first of all, good data management practices. And that extends from being very clear about your master data and what you need to identify so that you can correlate information from different applications of different systems. The concept of "master data" is simply things like unique identifiers across your project so that you can coordinate your schedules to your financials to your safety observations, etc.
I think the landscape right now has a lot of different systems and they don't necessarily talk to each other. We have to overcome that by having good data management practices.
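The master-data idea Chin describes — a shared identifier that lets schedules, financials and safety observations be correlated across systems — can be illustrated with a small sketch. The column names and records below are invented for illustration, assuming two systems that each export tabular data keyed on a common activity ID:

```python
import pandas as pd

# Hypothetical extracts from two separate systems, keyed on a shared
# master-data identifier ("activity_id" is an assumed name).
schedule = pd.DataFrame({
    "activity_id": ["A-101", "A-102", "A-103"],
    "planned_finish": ["2024-03-01", "2024-03-15", "2024-04-01"],
})
financials = pd.DataFrame({
    "activity_id": ["A-101", "A-102", "A-104"],
    "cost_to_date": [125_000, 98_000, 40_000],
})

# With a shared key, correlating the two systems is a simple join;
# without one, records can only be matched by hand.
merged = schedule.merge(financials, on="activity_id", how="outer", indicator=True)

# Rows present in only one system surface data-quality gaps immediately.
gaps = merged[merged["_merge"] != "both"]
print(gaps[["activity_id", "_merge"]])
```

The `indicator=True` flag marks which system each row came from, so records that exist in the schedule but not the financials (or vice versa) stand out — one concrete payoff of the process discipline discussed below.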
A second best practice would be good process adherence. Depending on the maturity of the company, there may or may not be good process discipline. One of the things that we have found made a big difference is defining how we are entering information, where we should be entering information, and what controls we need to put in place to ensure that the information we collect is clean and what we expect it to be.
Once you have the data management and the process discipline in place, integration across systems is key.
The last thing is to make that data transparent so that it can be useful. We have had a lot of success with near real-time dashboards that give people an overview of how a project is progressing on a daily basis, automatically fed from the system. It doesn't require someone to pull the report together because the dashboards are fed from a common data source, and the same information is visible to the frontline of project management.
That's been very helpful because you make data available to people on a much faster cycle time than the typical monthly reporting cycle. That enables them to take action to mitigate any concerns ahead of time. Over time our goal is to become much more predictive, act on leading indicators and mitigate effects before they become a problem.
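One common way a daily dashboard can surface leading indicators of the kind Chin describes is earned-value-style metrics. The field names, numbers and 0.9 threshold below are illustrative assumptions, not Suffolk's actual metrics:

```python
from dataclasses import dataclass

@dataclass
class ActivitySnapshot:
    planned_value: float   # budgeted cost of work scheduled to date
    earned_value: float    # budgeted cost of work actually performed
    actual_cost: float     # actual cost of work performed to date

def indicators(s: ActivitySnapshot) -> dict:
    spi = s.earned_value / s.planned_value   # < 1.0 suggests schedule slippage
    cpi = s.earned_value / s.actual_cost     # < 1.0 suggests a cost overrun
    return {
        "spi": round(spi, 2),
        "cpi": round(cpi, 2),
        # Flag early, on a daily cycle, rather than waiting for month-end.
        "flag": spi < 0.9 or cpi < 0.9,
    }

snap = ActivitySnapshot(planned_value=200_000, earned_value=170_000, actual_cost=190_000)
print(indicators(snap))
```

Because both indices are computed automatically from the same underlying data, nobody has to compile a report by hand — the dashboard simply recomputes them whenever the source systems update.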
When using collected data, are you mostly looking at the information to solve a problem or looking to find problems you can solve?
CHIN: Broadly speaking, I'm using it to solve a problem. The problem can come from noticing abnormalities in the data but more often, it's a business problem.
For example, if it's well accepted that projects typically run over time and over budget, then you can unpack that and ask why. What are some of the leading indicators that we are about to go over time or over budget? Can we do anything upfront to actually avoid that situation? Or is it because of an unforeseen change that we know, from our portfolio of projects and from our experience, is more likely to happen?
I also think ... data supports decision-making. There are a lot of decisions that you have to make in day-to-day operations. A project manager may have 10 to 20 change orders at any given time. There are lots of these types of decisions taken day to day on the job where additional data can help you make better decisions. That's when we see the value.
What is something about collecting data or solving problems with data in construction that has surprised you?
CHIN: I think I was surprised by how hard it is to embed some of these tools in the workflow. We are a very physical industry. We are not like other primarily desk-bound companies where it is easier to adopt the digital tools. So, when we think about how to embed some of these new tools that we've developed into the daily workflow, we have to be very cognizant around how much time [employees] are actually out working in the field. We can be very thoughtful about what requires them to sit in front of the computer and enter data or read the data versus what their main job is, which is out in the field, managing the builds.
From an outsider's perspective, what is a tech or data practice in construction that surprised you?
CHIN: I'll share two things I noticed within my first month of joining. One is that nowadays virtually all architects design in digital formats. Then contractors will often also do BIM modeling in order to model in more detail and check constructability.
However, the interface between the two is a 2D printed sheet of drawings, and those are the contractual drawings. I find that astonishing. Just from a data perspective, the amount of information lost when you take something that existed in a virtual environment and print it out makes that a very inefficient process.
I think the other thing that's surprising is that in construction, because there are so many changes in the course of the project, a very natural feedback loop from an information perspective that happens in many other industries tends not to be closed.
An example would be estimating costs. Conceptually, it's very simple to say you can estimate upfront, but what you really want to do when you finish the building is check what [the cost has] turned out to be and then use that to adjust your estimate. People kind of intuitively do that, but an automatic mechanism doesn't exist because typically there will be so many changes during the build that you can't really reconcile the final cost back to the original estimate.
It's similar for the digital twin: you may have built a digital twin upfront, but due to design changes and whatnot down the road, you don't close that loop again.
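The estimating feedback loop Chin describes — comparing final actuals to the original estimate and feeding the difference back into future bids — could in principle be closed with a simple calibration rule. The smoothing rate and the project figures below are invented for illustration:

```python
# Toy sketch of closing the estimating feedback loop: blend each observed
# actual/estimate ratio into a running calibration factor via exponential
# smoothing. Alpha and the (estimate, actual) pairs are assumptions.

def update_calibration(factor: float, estimated: float, actual: float,
                       alpha: float = 0.3) -> float:
    """Nudge the calibration factor toward the observed actual/estimate ratio."""
    observed_ratio = actual / estimated
    return (1 - alpha) * factor + alpha * observed_ratio

factor = 1.0  # start by assuming estimates are unbiased
for estimated, actual in [(10.0e6, 11.2e6), (4.5e6, 4.8e6), (8.0e6, 8.9e6)]:
    factor = update_calibration(factor, estimated, actual)

# Multiply the next conceptual estimate by the learned factor.
print(f"calibration factor: {factor:.3f}")
```

A factor drifting above 1.0 would signal systematic underestimation, which is exactly the kind of signal that gets lost when, as Chin notes, mid-project changes prevent reconciling the finished building against the original plan.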
What are the best practices for adopting a new data system into a rigid traditional construction process?
CHIN: I always talk with business leadership for adoption. Success factors for adoption happen way before the product is ready to be rolled out. It actually happens at the conception of the idea and the design of the idea itself.
What's critical is the partnership with the business leader. It's important to partner in a way where they feel real ownership around the solution itself and they see the value because at the end of the day, the value capture comes from the frontline. And if they are the business leader that owns that frontline, then they need to be bought into it.
So enrolling the right business leadership, building those relationships up front and then enrolling them in the idea is absolutely critical.
Once we have that business leadership, I would say making sure that something's really simple and intuitive to use is also critical. People are very busy on the jobsites and they're so used to slick consumer applications nowadays that unless you have an enterprise application that's also equally user-focused you run into adoption challenges.
When it comes to adoption and the ability to actually embed a tool in the frontline, the two things that matter are training and integration into the workflow. [You have to be] very thoughtful about [questions like] 'how often are they expected to use it,' 'why should they use it,' and 'how does it change what people are doing day to day?'
How do you fully vet new technology before implementing it?
CHIN: We have a pretty robust process for piloting, both with external solutions as well as internal solutions. The process starts with enrolling the users in the design upfront so that you have that subject matter expertise input. Once you have the design, then there is the technical testing, putting it into a development environment or having people beta test. Alpha testers go in and try to break it, measuring things like load time, lag time, accuracy rates, all the technical specs.
We [also] have superuser groups use it and try to break it as well and give feedback. Then we roll it up to a broader group of more regular users, but it's still in beta form. That’s when we see what the reception and the input is. That's often the most interesting phase because you start moving away from just a technical implementation.
The business then starts using it and saying, where is it most useful? Where is it less useful? Then we move on to a full-scale rollout end to end. That process can take a year or more depending on the complexity of the solution.
What will the ideal jobsite data process look like in five to 10 years?
CHIN: I'd like to see data advance to a point where there would be [what we could] call a "data war room" on the site itself where you can see how the site operates. You analyze on a daily or maybe weekly basis where the job is and therefore what needs to be done. And then you can be as forward-looking as possible and figure out what needs to happen to actually accelerate the work or to mitigate risks.