Sunday evening is normally when many of us start stressing about the much-dreaded Monday morning. On one such Sunday evening, a property manager at one of our client buildings forwarded us an automated email he had received from the Natural Gas provider for a building in his portfolio. The email stated that the building's Natural Gas flow for the month of July was 20% above its Natural Gas 'baseline'. Since our client is committed to reducing the carbon footprint and energy consumption of every building he manages, the suggestion of a 20% increase in gas consumption really set off the alarm bells. Understandably concerned about the operation of his building, our client asked Dimax what immediate measures should be taken at the building to solve the problem.
Our first step was to quickly assess the validity of the email, since deploying anything at the building level would cost the client money and might not yield a result in the building owner's best interest. To assess validity, we first reviewed the past 60 days of operational data from the building's gas-consuming equipment, which our servers had been collecting in real time and storing (at 5-minute intervals) for the past several years. This review showed that the equipment's operation over the past 60 days was no different than it had been over the 12 months prior. Since the operational data suggested that the excessive gas consumption wasn't actually real, our next step was to obtain the monthly gas meter readings for the past two years directly from the utility company. After analyzing that data, we found that the building had actually decreased its gas consumption by 9% in July 2015 relative to July 2014 on a weather-corrected basis. With this information, we suggested that the client disregard the automated email pending further investigation into what caused it to be triggered.
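For readers curious what a weather-corrected comparison involves, here is a minimal sketch using degree-day normalization, one common approach. The function name and all the numbers below are illustrative assumptions, not the building's actual data (the 9% figure above came from the utility's real meter readings):

```python
def weather_corrected_change(usage_prior, dd_prior, usage_current, dd_current):
    """Compare two periods on a weather-corrected basis.

    Each period's consumption is normalized by its degree days,
    so a mild month and a severe month can be compared fairly.
    Returns the fractional change (negative = consumption decreased).
    """
    intensity_prior = usage_prior / dd_prior        # e.g. therms per degree day
    intensity_current = usage_current / dd_current
    return (intensity_current - intensity_prior) / intensity_prior

# Illustrative numbers only:
change = weather_corrected_change(
    usage_prior=1000.0, dd_prior=250.0,     # July 2014: usage and degree days
    usage_current=950.0, dd_current=260.0,  # July 2015: usage and degree days
)
print(f"Weather-corrected change: {change:+.1%}")
```

The key point is that raw meter totals alone can mislead: a month with more extreme weather will naturally consume more, so normalizing by degree days isolates changes in the building's actual performance.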
In the end, the cause of the automated email turned out to be how it was configured on the provider's servers. The gas data showed that the utility company had taken two partial-month readings in July 2014, the total of which represented that whole month's consumption, whereas in July 2015 there was only one reading for the entire month. The servers that generated the automated email compared the July 2015 gas reading to only ONE of the two July 2014 readings instead of their sum, which triggered the notification email. If Dimax hadn't analyzed the data behind the automated email, our client would have spent money unnecessarily sending service providers to site to check equipment that was working fine all along. Furthermore, whenever service providers are deployed to site to solve a problem that doesn't actually exist, the building owner incurs cost and risks having equipment settings changed unnecessarily as the service provider tries to do the best job possible (which oftentimes makes the building perform worse than it did before).
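The fix on the alerting side amounts to aggregating all readings within a billing month before comparing year over year. A minimal sketch of that aggregation, with invented dates and therm values for illustration:

```python
from collections import defaultdict
from datetime import date

# Hypothetical meter readings as (read date, therms) pairs.
readings = [
    (date(2014, 7, 10), 420.0),  # first partial-month reading
    (date(2014, 7, 28), 380.0),  # second partial-month reading
    (date(2015, 7, 25), 760.0),  # single reading covering all of July 2015
]

def monthly_totals(readings):
    """Sum every reading that falls in the same (year, month) bucket."""
    totals = defaultdict(float)
    for read_date, therms in readings:
        # Sum partial readings rather than picking one of them.
        totals[(read_date.year, read_date.month)] += therms
    return dict(totals)

totals = monthly_totals(readings)
prior, current = totals[(2014, 7)], totals[(2015, 7)]
print(f"July 2014 total: {prior}, July 2015 total: {current}")
```

With these example numbers, comparing 760 therms against just one partial 2014 reading (420) would report an alarming increase of over 80%, while comparing against the true monthly total (800) shows a modest decrease, which mirrors the false alert described above.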
This story is a good example of why data cannot always be interpreted as information. The automated email stating that the Natural Gas flow for July 2015 was 20% above the Natural Gas 'baseline' contains what appears to be information, but it was sent on the basis of raw data alone. Translating data into actionable information requires analysis, and that analysis must be grounded in the context of the building for it to be of the highest value to the owner. The intersection of a building's end use, its HVAC systems, and its owner's business objectives varies immensely and is constantly evolving. Although it would be extremely difficult to program sufficient context and dynamic business objectives into software today, with progress in the field of advanced analytics it is only a matter of time before we witness truly effective autonomous analyses.
Follow us on Twitter: @DimaxOfficial and on LinkedIn