Article

Private Hospital Energy Performance Benchmarking Using Energy Audit Data: An Italian Case Study

Daniele Dadi, Vito Introna, Annalisa Santolamazza, Marcello Salvio, Chiara Martini, Tiberio Pastura and Fabrizio Martini
1 Department of Enterprise Engineering, University of Rome Tor Vergata, 00133 Rome, Italy
2 DEIM School of Engineering, University of Tuscia, 01100 Viterbo, Italy
3 DUEE-SPS-ESE Laboratory, Italian National Agency for New Technologies, Energy and Sustainable Economic Development (ENEA), Lungotevere Thaon di Revel, 76, 00196 Rome, Italy
* Author to whom correspondence should be addressed.
Submission received: 24 November 2021 / Revised: 14 January 2022 / Accepted: 18 January 2022 / Published: 22 January 2022
(This article belongs to the Special Issue Industry and Tertiary Sectors towards Clean Energy Transition)

Abstract

The increased focus on energy efficiency at both the national and international levels has fostered the diffusion and development of specific energy consumption benchmarks for the most relevant economic sectors. In this context, energy-intensive facilities such as hospitals and health structures represent a unique case. Indeed, despite the high energy consumption of these structures, the scientific literature lacks adequate energy performance benchmarks, especially for the European context. Thus, this study aimed at defining energy benchmark indicators for the Italian private healthcare sector using data collected from the Italian mandatory energy audits carried out according to Article 8 of EU Directive 2012/27/EU. The benchmark indicators were defined using a methodology proposed by the Italian National Agency for New Technologies, Energy and Sustainable Economic Development (ENEA). This methodology provides for the calculation of specific energy performance indicators (EnPIs) by considering the global energy consumption of the different sites and the sector's relevant variables. The results obtained were compared with those obtained from a consolidated but more complex methodology: the one envisaged by the Environmental Protection Agency. This comparison allowed us to validate the reliability of the proposed methodology, as well as the validity and future usability of the calculated indicators. Relying on a significant database containing actual data from recent energy audits, this study was thus able to provide an up-to-date and reliable benchmark for the private healthcare sector.

1. Introduction

In Italy, about one-third of total energy use is attributable to the building sector. In this context, buildings destined for hospital use are particularly significant: in addition to their social role, they are highly energy-intensive structures. The average consumption in hospitals is three times higher than in the residential sector under similar climatic conditions [1]. Although these structures are intense energy users, their energy analysis and characterization have not been sufficiently investigated. Indeed, energy efficiency has not been considered one of the sector's main objectives compared with requirements such as quality of services, functionality, or patients' well-being.
Our purpose was to take an important first step toward the energy efficiency of this relevant sector by defining energy performance benchmark indicators.
To achieve this objective, we used a large dataset of mandatory energy audits, collected by the Italian National Agency for New Technologies, Energy and Sustainable Economic Development (ENEA), covering several structures operating in the Italian private health sector.

1.1. Energy Consumption in Hospitals

A hospital structure has several peculiarities from the point of view of energy consumption. Hospitals must ensure services 24 h a day, seven days a week, throughout the year. In addition, the structures themselves must comply with a series of constraints imposed by regulations to ensure a high level of comfort and healthiness of the environments. Despite their high complexity, hospitals have the potential to reduce consumption through investments and interventions aimed at improving the energy efficiency of structures and systems and limiting energy waste.
In general, hospitals’ energy needs consist of using electricity and heat. Electricity is used to power medical, diagnostic, and monitoring equipment, indoor and outdoor lighting, summer air conditioning, air treatment, and the operation of computerized and security systems. Thermal energy is mainly used for the heating and air conditioning of rooms, sanitary water production, sterilization, and laundry and kitchen services. In turn, the uses of electricity and heat can be classified into two categories. The first refers to hotel-type uses to guarantee the well-being of healthcare workers and patients, including indoor and outdoor lighting, summer and winter air conditioning, lifts, the preparation of domestic hot water, and laundry and kitchen activities. The second refers to the uses for surgery, treatment, and diagnosis devices, i.e., diagnostic-medical equipment and instruments for sterilization [2].
Thermal energy best lends itself to rationalization interventions since, in addition to having a high impact on total energy consumption, it is mainly used for space heating. This use allows for temporary interruptions while the intervention is implemented without compromising the well-being of the people present in the hospital. Rationalization interventions are also possible for electricity, but it is necessary to consider that significant interruptions are not allowed, as electricity powers services of primary importance that require continuity of supply [3].

1.2. Energy Benchmarking

Over the last few years, several studies have focused on the analysis of the energy performance of health facilities and hospitals [4,5,6,7,8] for different countries such as Germany [9], China [10], the United States [11], and Korea [12].
On the other hand, other studies were not limited to an energy analysis but were aimed at defining specific benchmarks for different countries under different operating conditions, such as differences in management and, above all, environmental conditions.
In this regard, the UNI CEI EN 16231:2012 standard [13], entitled "Energy efficiency benchmarking methodology", emphasizes the importance of determining reference indices to compare performance. This comparison can be internal to the organization, through the analysis of historical data, or external, through the comparison of the organization's performance with that of other organizations in the sector. Through this energy comparison, a company can become aware of its performance and invest, if necessary, in improvement programs in terms of energy efficiency.
Different benchmarking approaches were developed for the specific health sector [14,15,16,17] and at a more general level in buildings [18,19,20,21,22].
These works are based on different approaches, ranging from the definition of energy performance indicators (EnPIs) with the identification of the relevant variables [12,15,17,18,20,21] to statistical linear regression models, mainly using the methodology proposed by the Environmental Protection Agency (EPA), which is described in detail in the following sections [14,19,22,23,24].
Although there are approaches aimed at studying the energy behavior of health buildings and attempts at benchmark definitions, we found a lack of references in the scientific literature, especially regarding the Italian or, more generally, the European context. The approaches mentioned above do not translate into reliable and up-to-date benchmarks that a structure in the sector can use as a reference. Benchmarking is instead more developed in other countries, such as the United States; however, since the building structures and energy behavior are very different, those benchmarks are not considered applicable to the European context.
This study aimed to define energy benchmark indicators for the Italian private health sector using a simple approach based on the calculation of EnPIs, following a methodology proposed by ENEA that has been used successfully in other contexts [25].
One of the strengths of this work is the possibility to rely on a significant database containing actual data from recent energy audits, which allowed us to obtain up-to-date and reliable results well suited to the Italian context of interest.
Moreover, to discuss and validate the results obtained, these were compared with the results achieved using a consolidated but more complex methodology: the one proposed by the EPA.
This paper is structured as follows: Section 2, Materials and Methods, describes the dataset, including the data collection and preprocessing activities, and introduces the main steps of the ENEA methodology used to determine the benchmark EnPIs and of the EPA methodology used to validate the results obtained. Section 3 describes the results obtained by applying the two methodologies, while Section 4 discusses the main issues encountered and compares the results obtained with the ENEA methodology with those obtained using the methodology proposed by the EPA. Finally, Section 5 discusses the objectives, the most significant results obtained, and the next steps of the research.

2. Materials and Methods

This section describes the approach used: the methods applied to preprocess the data and obtain the final dataset, the main steps of the methodological approach proposed by ENEA, and the EPA methodology used to compare the results obtained.

2.1. Data Collection and Preprocessing

Directive 2012/27/EU [25] establishes that “Member States shall ensure that enterprises that are not SMEs are subject to an energy audit carried out in an independent and cost-effective manner by qualified and/or accredited experts or implemented and supervised by independent authorities under national legislation by 5 December 2015 and at least every four years from the date of the previous energy audit.” For Italy, the energy audits are collected every four years by ENEA.
For the purposes of this work, the energy audits for the Italian private health sector, received by ENEA in 2019 in correspondence with the second cycle of energy audits, were analyzed in order to define benchmarks suited to the Italian context.
In order to report relevant information about their energy consumption, each organization was required to submit a summary spreadsheet with every energy audit report. Taking into account the lessons learned during the first cycle of energy audits in 2015, ENEA decided to create a summary spreadsheet to use specifically for hospitals and health facilities in order to enable the collection of more detailed information about the energy consumption of the structure.
In 2019, with reference to the NACE Q86 code (Human health activities), the number of health facilities potentially subject to the obligation to carry out the energy audit was 328. However, for feasibility reasons, multi-site health companies were allowed to carry out energy audits on a limited number of representative sites using a clustering strategy developed by ENEA. Therefore, 152 energy audits were actually received by ENEA, with a high percentage (145 audits, 95.4%) belonging to NACE code 86.1 (Hospital activities), which is why it was the only code considered.
Referring to the Italian economic activity classification ATECO (ATtività ECOnomica), revised in 2007 and derived from the European classification NACE, Table 1 reports the descriptions of the subcategories of the ATECO code 86.1 and the number of audits for each category.
A first analysis of the data collected showed that a minority of organizations did not use the updated summary spreadsheet implemented for the health sector, but a general one belonging to the tertiary sector. Since relevant information was absent in this other type of summary spreadsheet, to conduct complete and more in-depth analyses, the sample was reduced to only the organizations that used the updated summary spreadsheet, i.e., 85. However, further analysis showed that some information collected in the files was incongruent or incomplete. As a result, a final database consisting of 58 energy audits was obtained and analyzed.
For each healthcare structure, the following information was available:
  • Site data, i.e., the identification of the site, its name, city, VAT number, NACE code, and whether or not it is accredited to the National Health Service (NHS);
  • General details of the structure, i.e., the covered area, the number of health workers, the number of beds, and the presence or absence of a swimming pool;
  • Overall consumption of electricity, heating, and cooling relating to each site;
  • Consumption and data relating to the two macro-areas into which a hospital structure can be divided: hospitalization, and diagnosis and therapy.
Figure 1 shows some of the characteristics of the final sample analyzed in terms of the sites, beds, and health workers divided by the ATECO code and in terms of accreditation to the NHS.
To complete the available data, for each structure in the database, the heating degree days and cooling degree days were calculated through the website Degree Days [26]. In particular, the reference temperatures on which the heating and cooling degree days were based were set to 22 °C and 25 °C, respectively, taking into account the minimum requirements that a healthcare facility must comply with.
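For readers who prefer to reproduce the degree-day figures from raw weather data rather than retrieving them from the website, the computation reduces to summing daily deviations from the base temperatures. The following minimal Python sketch (with invented daily temperatures; the function name is ours) illustrates the calculation with the 22 °C and 25 °C bases used in this study:

```python
# Minimal sketch: heating and cooling degree days from daily mean temperatures,
# using the base temperatures adopted in this study (22 °C heating, 25 °C cooling).
# The input series is fictitious; the study itself used values from degreedays.net.

def degree_days(daily_mean_temps, heating_base=22.0, cooling_base=25.0):
    """Return (HDD, CDD) in °C·day for a series of daily mean temperatures [°C]."""
    hdd = sum(max(heating_base - t, 0.0) for t in daily_mean_temps)
    cdd = sum(max(t - cooling_base, 0.0) for t in daily_mean_temps)
    return hdd, cdd

hdd, cdd = degree_days([12.4, 14.1, 18.9, 26.3, 27.0, 21.5, 10.2])  # one invented week
print(f"HDD = {hdd:.1f} °C·day, CDD = {cdd:.1f} °C·day")
```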

2.2. Data Analysis

2.2.1. ENEA Methodology

The procedure proposed by ENEA for determining the benchmark energy performance indicators (EnPIbmk) consists of a series of steps [27]:
  • Identification of the relevant variables;
  • Calculation of the energy performance indicators (EnPI) for each site;
  • Calculation of the average energy performance indicators (EnPIavg);
  • Definition of the EnPIbmk;
  • Evaluation of the reliability of the EnPIbmk.
The first step of the methodology involves the identification of the relevant variables, which are those quantifiable factors that significantly impact energy performance and routinely change (weather conditions, operating conditions, working hours, production output, etc.) [28]. The identification of these variables is usually determined by the knowledge of the energy system under analysis and is supported by the reference scientific literature. The second step involves the calculation of the energy performance indicator (EnPI) for each site in the sample considered, which is defined as the ratio between energy consumption and the representative consumption parameter (relevant variable):
$$\mathrm{EnPI}\;\left[\tfrac{\mathrm{toe}}{\mathrm{m^2,\,bed,\,etc.}}\right] = \frac{\text{energy consumption}\;[\mathrm{toe}]}{\text{parameter}\;[\mathrm{m^2,\,bed,\,etc.}]}$$
Subsequently, the average energy performance indicator (EnPIavg) is calculated as the average of the EnPIs of the individual structures, together with the relative standard deviation (st.dev.), which expresses the dispersion of the data of the sample considered around the average. The benchmark energy performance indicators are then determined using the following formula:
$$\mathrm{EnPI_{bmk}} = \mathrm{EnPI_{avg}} \pm \mathrm{st.dev.}$$
Based on the ratio value between the standard deviation and the EnPIavg, it is possible to evaluate the reliability of the EnPIbmk. Reliability is considered as follows:
  • “High” if the ratio is less than 20%;
  • “Average” if the ratio is between 20% and 60%;
  • “Low” if the ratio is greater than 60%.
Figure 2 reports a schematic representation of the methodology followed.
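To make the steps above concrete, the following Python sketch (with invented consumption and area values; all variable names are ours) computes a covered-area EnPI for each site, the average indicator, the benchmark with its standard deviation, and the reliability class defined by the thresholds listed above:

```python
# Minimal sketch of the ENEA benchmarking steps on fictitious data: EnPI per site,
# average EnPI, benchmark as average ± st.dev., and reliability classification.
from statistics import mean, stdev

sites = [  # (total consumption [toe], covered area [m2]) -- invented values
    (1200.0, 24000.0), (860.0, 15500.0), (2100.0, 41000.0), (640.0, 14200.0),
]

enpis = [consumption / area for consumption, area in sites]  # EnPI per site [toe/m2]
enpi_avg, st_dev = mean(enpis), stdev(enpis)

ratio = st_dev / enpi_avg
reliability = "High" if ratio < 0.20 else "Average" if ratio <= 0.60 else "Low"

print(f"EnPI_bmk = {enpi_avg:.4f} ± {st_dev:.4f} toe/m2 ({reliability} reliability)")
```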

2.2.2. EPA Methodology

The EPA has developed a technical methodology for evaluating the energy performance of different types of buildings; in this study, reference was made to the specific one developed for hospitals [29]. This methodology consists of a mathematical model for the definition of the energy efficiency ratio (EER). The purpose of the methodology is to identify, through regression analysis, the key factors that determine energy consumption in order to develop a consumption forecasting model that allows for evaluating the energy performance of a hospital or, more generally, of a building. The procedure is divided into a sequence of phases, which were adapted according to the information contained in the energy audits under study.
The first phase involves defining a group of structures with similar functional and operational characteristics in order to compare the structures themselves and overcome any technical limitations in the data. It is then necessary to define the variables for the regression analysis. The dependent variable is the energy use intensity (EUI), equal to the total energy consumption of the site (EC) divided by the site's surface area. The independent variables, on the other hand, are the factors that characterize the health facility and can impact energy consumption (X1—health workers per square meter, X2—beds per square meter, X3—cooling degree days, X4—heating degree days, and X5—machines per square meter). The predicted EUI is therefore calculated as follows, with a0, a1, a2, a3, a4, and a5 as the parameters of the linear regression [29]:
$$\text{Predicted EUI} = a_0 + a_1 X_1 + a_2 X_2 + a_3 X_3 + a_4 X_4 + a_5 X_5$$
After determining the regression model for forecasting the energy use intensity, the methodology defines the energy efficiency ratio (EER) for each site as:
$$\mathrm{EER} = \frac{\text{Actual EUI}\;[\mathrm{toe/m^2}]}{\text{Predicted EUI}\;[\mathrm{toe/m^2}]}$$
The numerator represents the energy consumption intensity for the specific health facility, which is calculated using measured data. In contrast, the denominator represents the expected value of the energy consumption intensity, which is calculated through the previously determined regression model using the measured values of the independent variables (X1, X2, X3, X4, X5) for the same site as inputs. Thus, a low energy efficiency ratio indicates that the specific health facility is more efficient than the average because it uses less energy than predicted, whereas a high energy efficiency ratio indicates the opposite.
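As an illustration of this step, the sketch below (fictitious data; only the three predictors retained later in this study are used, and all names are ours) fits an ordinary least-squares model for the EUI and derives the EER of each site:

```python
# Minimal sketch of the EPA-style procedure on invented data: fit a linear model
# for energy use intensity (EUI) and compute each site's energy efficiency ratio
# (EER) as actual EUI divided by predicted EUI.
import numpy as np

# Predictors: health workers per m2, beds per m2, cooling degree days (invented).
X = np.array([
    [0.020, 0.012, 350.0],
    [0.031, 0.009, 610.0],
    [0.015, 0.016, 420.0],
    [0.026, 0.011, 500.0],
    [0.018, 0.014, 300.0],
    [0.024, 0.010, 550.0],
])
actual_eui = np.array([0.052, 0.061, 0.043, 0.055, 0.047, 0.058])  # toe/m2, invented

A = np.column_stack([np.ones(len(X)), X])          # add the intercept term
coeffs, *_ = np.linalg.lstsq(A, actual_eui, rcond=None)

predicted_eui = A @ coeffs
eer = actual_eui / predicted_eui                   # <1: better than expected, >1: worse
print(np.round(eer, 3))
```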
After computing the EER for each element of the sample, the results can be analyzed through a frequency distribution to highlight the differences in the energy efficiency of the sample.
Finally, by sorting the values of the EER from smallest to largest, it is possible to calculate the cumulative distribution of the EER for the sample and use regression analysis to obtain the value of the cumulative percentage as a function of the energy efficiency ratio.
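These last two steps can be sketched as follows (again with invented EER values and a simple linear fit chosen only for illustration; the study selects the functional form through regression on the real sample):

```python
# Minimal sketch: sort the EERs, build the cumulative percentage, and fit a curve
# that returns the cumulative percentage as a function of the EER (invented data).
import numpy as np

eer = np.array([0.63, 0.81, 0.95, 1.02, 1.10, 1.24, 1.38])   # fictitious sample
eer_sorted = np.sort(eer)
cum_pct = np.arange(1, len(eer_sorted) + 1) / len(eer_sorted)

slope, intercept = np.polyfit(eer_sorted, cum_pct, 1)         # illustrative linear fit

def cumulative_percentage(eer_value):
    """Estimated share of sample sites with an EER at or below eer_value."""
    return float(np.clip(intercept + slope * eer_value, 0.0, 1.0))

print(f"EER = 0.90 -> cumulative percentage of about {cumulative_percentage(0.90):.0%}")
```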
In conclusion, through its mathematical formulation, the model created makes it possible to compare the energy performance of a generic health facility with those of the sample used.

3. Results

3.1. ENEA Results

We used the database defined in the previous section to calculate the benchmark energy performance indicators for the private health sector. Specifically, the energy performance indicators were calculated using, as the numerator, the energy consumption given by the sum of the health facility's electricity, heating, and cooling consumption. The denominator, instead, changed for each energy performance indicator (as shown in Table 2), using the relevant variables available in the database.
The energy performance indicators were defined for the ATECO 86.10.10 (general hospitals and nursing homes) and 86.10.20 (specialist hospitals and nursing homes) codes. The ATECO 86.10.30 code (institutes, clinics, and university polyclinics) was not analyzed, as it was not significant in terms of the sample size. Moreover, the analysis was also conducted specifically for the hospitals accredited and not accredited to the NHS. Additional indicators were assessed considering a more specific part of the data available, namely, that relating to hospitalizations and diagnosis and therapy, using only the sites that had filled in the relevant fields provided within the summary file. To limit the possible distortions of energy consumption, we decided to exclude sites with a swimming pool from the sample in the analyses explained above.
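As an illustration of how the grouped indicators reported in Table 2 can be derived, the sketch below (invented data; column and function names are ours) groups sites by ATECO code and applies the ENEA steps to each group:

```python
# Minimal sketch on fictitious data: benchmark EnPIs per category (here by ATECO
# code) as mean ± standard deviation, with the ENEA reliability classification.
import pandas as pd

sites = pd.DataFrame({
    "ateco":        ["86.10.10", "86.10.10", "86.10.20", "86.10.20"],
    "consumption":  [1200.0, 860.0, 2100.0, 640.0],       # toe, invented
    "covered_area": [24000.0, 15500.0, 41000.0, 14200.0], # m2, invented
})
sites["enpi_ca"] = sites["consumption"] / sites["covered_area"]  # toe/m2

def classify(ratio):
    return "High" if ratio < 0.20 else "Average" if ratio <= 0.60 else "Low"

summary = sites.groupby("ateco")["enpi_ca"].agg(["mean", "std"])
summary["reliability"] = (summary["std"] / summary["mean"]).apply(classify)
print(summary)
```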

3.1.1. Energy Performance Indicators: Generality of the Structure

The benchmark energy performance indicators (EnPIbmk) were defined by relating the energy consumption to three relevant variables:
  • The covered area (ca) in square meters;
  • The number of health workers (hw);
  • The number of beds (b).
These variables were shown to significantly impact the energy consumption for hospitals in several studies [11,15,17,18].
Table 2 shows the results of the EnPIs calculations.
For the categories identified, the indicator referring to workers always had average reliability. Good results were also obtained considering the covered area, while the worst results were obtained considering the number of beds as the relevant variable.
The same benchmark indicators were also identified only for health facilities accredited to the NHS, improving the reliability of some indicators compared with those defined considering the whole dataset.
For example, Figure 3 shows a graphical representation of one of the calculated EnPIs using the area covered in square meters as the relevant variable, showing good reliability for the ATECO 86.10.10 code.
In order to differentiate the structures and conduct a more targeted analysis, a further subdivision of the structures was envisaged during the energy audit phase. Each structure was divided into two macro-areas: hospitalization, and diagnosis and therapy. Each of these could in turn be divided into several parts; the results of the related analyses are given in the following paragraphs.

3.1.2. Energy Performance Indicators: Hospitalizations

The hospitalization macro-area represented the hotel area of the health facility. We could divide hospitalization into five specific hospital wards: overall areas of hospitalization, intensive care, day surgery, dialysis, and gyms and rehabilitation. During the energy audit, for each of the areas present within the health facility, it was possible to indicate the consumption of electricity, heating, and cooling; the number of days in the hospital; and the covered area of the relative spaces. This information was used to determine more specific EnPIs, which were useful for comparing similar structures in terms of wards.
Starting from the database and excluding the sites belonging to the ATECO 86.10.30 code and those with a swimming pool, the number of sites that provided the data requested for at least one area among the five previously listed was 24. However, these sites differed from each other in terms of the areas present within them. In the definition of the benchmark indices, this heterogeneity made it necessary to consider, for each area analyzed, the subset of health structures in which that area was actually present.
For each area, two energy performance indicators were defined. The first related the sum of the electricity consumption, heating, and cooling of the single area to the relative number of days in hospital (dh), while the second one related the sum of electricity consumption, heating, and cooling of the single area to the relative surface area (sh). Following the ENEA methodology steps defined in the previous paragraphs, it was possible to calculate the benchmark indicators and their relative reliability. Table 3 summarizes the results of the reliability evaluation for the EnPIs related to hospitalizations.
The calculated benchmark indicators related to the surface area showed average reliability only for the overall areas of hospitalization and for the day surgery ward, while for the remaining areas, we did not find valid benchmark indicators due to the "low" reliability, both concerning the number of days of hospitalization and the surface area. For the dialysis ward, it was not possible to calculate the respective indicators due to an excessively small sample.

3.1.3. Energy Performance Indicators: Diagnosis and Therapy

The diagnosis and therapy macro area represented the operating area of the health facility. We could divide the diagnosis and therapy into seven specific activities: operating block, sterilization, radiology and diagnostic imaging, first aid, functional and endoscopic examinations, transfusion center, and laboratory diagnostics. For each of the services provided by the health facility, among the information contained in the collected energy audits, it was possible to find the consumption of electricity, heating, and cooling; the number of services provided; and the surface areas of the spaces where the services themselves are provided.
Starting from the database defined in Section 2.1 and excluding the sites belonging to the ATECO 86.10.30 code and those with a swimming pool, the number of sites that provided the data requested for at least one of the seven activities listed was 30. However, they did not all perform the same diagnosis and therapy activities; consequently, in developing the benchmark EnPIs for each type of service provided, only the subset of health facilities carrying out that activity was considered. For each activity, two energy performance indicators were defined: the first relates the sum of the electricity, thermal, and cooling energy consumption of the single activity to the relative number of services provided (ns), while the second relates the sum of the consumption of electricity, heating, and cooling of the single activity to the relative surface area (ss) where it is carried out.
Table 4 summarizes the results of the reliability evaluation for the EnPIs related to diagnosis and therapy.
All benchmark indicators calculated for the diagnosis and therapy activities showed low reliability, both for the number of services provided and the covered surface area. These results were mainly due to the high heterogeneity of the services provided within the same specific activity.

3.2. EPA Results

Using the same starting database and following the EPA methodology described in the previous paragraphs, the first step was to define a sample of health facilities that was as homogeneous as possible. This resulted in the exclusion of 20 sites from the 58 sites initially present in the database to provide a final sample of 38 health facilities. In particular, the sites excluded were as follows:
  • Those belonging to the ATECO code 86.10.30;
  • Those with a swimming pool inside.
The dependent variable of the regression model was the intensity of energy consumption (toe/m2), which is equal to the ratio between the sum of the electrical, thermal, and cooling energy consumed and the covered area. For the choice of the independent variables, the data relating to both the general characteristics of the health facility and the climatic conditions were considered, namely, the covered area, the number of health workers, the number of beds, the heating degree days, and the cooling degree days. In particular, the health workers and the beds were considered in terms of surface density, i.e., dividing the respective values by the covered area. Therefore, the independent variables were as follows:
  • Health workers per square meter (employee/m2);
  • Beds per square meter (bed/m2);
  • Heating degree days (°C·day);
  • Cooling degree days (°C·day).
The additional independent variable “machines per square meter” mentioned in the EPA methodology was not included in the analysis since it was not among the data collected from the mandatory energy audits.
Several regression analyses were conducted to define the combination of statistically significant parameters (p-value lower than 0.05). After evaluating the different combinations and the presence of outliers, it was possible to define the regression model using the parameters reported in Table 5. The adjusted R2 value was equal to 0.4677.
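As a cross-check (the methodology does not state the definition explicitly, so this is the standard adjusted R2 formula, with n = 38 observations and p = 3 regressors), the tabulated value is reproduced up to rounding:

$$R^2_{\mathrm{adj}} = 1 - \left(1 - R^2\right)\frac{n - 1}{n - p - 1} = 1 - (1 - 0.5108)\,\frac{37}{34} \approx 0.468$$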
By analyzing the results reported in Table 5, it is possible to make some considerations. The coefficient relating to the energy driver "health workers per square meter" was positive. In contrast, the coefficients obtained for the energy drivers "beds per square meter" and "cooling degree days" were negative. All three of these coefficients were statistically significant. It is important to emphasize that the model refers to the total energy consumption (electrical, thermal, and cooling energy), and the energy consumption can have different dynamics for the structures in the dataset. For example, some relevant differences may be due to the geographical position, the main energy users, the presence of self-production systems of energy (e.g., trigeneration systems), and the daily dynamics of the sites. The energy driver "beds per square meter" had a negative coefficient due to a different use of the spaces among the structures: a higher number of beds per square meter translated into a different use of the spaces, which, in turn, could lead to optimized energy consumption (e.g., air conditioning).
Since the dependent variable in the equation is the total energy consumption of the site (EC) divided by the site's surface area (i.e., the energy use intensity), the explanatory power of the surface area is not captured in the R2 value, which artificially lowers it. Thus, the EPA methodology suggests recalculating the R2 value in terms of energy consumption (EC) [29]:
$$R^2 = 1 - \frac{\sum_{i=1}^{38}\left(\mathrm{ActualEC}_i - \mathrm{PredictedEC}_i\right)^2}{\sum_{i=1}^{38}\left(\mathrm{ActualEC}_i - \overline{\mathrm{ActualEC}}\right)^2}$$
The R2 value thus calculated was equal to 0.8350, a more than satisfactory value.
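The recomputation can be sketched as follows (invented values; the conversion simply multiplies each EUI by the site's covered area before evaluating the fit):

```python
# Minimal sketch of the R2 recomputation in terms of energy consumption (EC):
# predictions made on EUI are converted back to EC = EUI * covered area.
import numpy as np

area          = np.array([24000.0, 15500.0, 41000.0, 14200.0])  # m2, invented
actual_eui    = np.array([0.052, 0.061, 0.043, 0.055])          # toe/m2, invented
predicted_eui = np.array([0.050, 0.058, 0.047, 0.052])          # toe/m2, invented

actual_ec    = actual_eui * area
predicted_ec = predicted_eui * area

ss_res = np.sum((actual_ec - predicted_ec) ** 2)
ss_tot = np.sum((actual_ec - actual_ec.mean()) ** 2)
print(f"R2 on energy consumption: {1.0 - ss_res / ss_tot:.4f}")
```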
At this point, a health facility can evaluate its energy performance by calculating the energy efficiency ratio, which is given by the ratio between the actual energy use intensity and the predicted energy use intensity, calculated through the regression model. An energy efficiency ratio value lower than one indicates that the health facility uses less energy than expected and is consequently more efficient; on the other hand, a value greater than one indicates lower efficiency.
The energy efficiency ratios of the 38 structures belonging to the sample were then calculated. Figure 4 shows the distribution of the energy performance ratios: the most energy-efficient health facility is located on the far left of the distribution, while the least efficient health facility is on the far right.
The energy efficiency ratios were sorted in ascending order, and we were able to calculate the cumulative percentage for each sample ratio. Finally, through the regression analysis, the equation of the curve was determined, which expressed the value of the cumulative percentage as a function of the energy efficiency ratio. The significance value was set at 0.05. Figure 5 graphically shows the regression performed, while Table 6 shows the results.
After determining its energy efficiency ratio, a health facility that intends to evaluate its energy performance compared to those of the sample can calculate the corresponding cumulative percentage value and identify the percentage of sites in the sample with better or worse performances. For example, a cumulative percentage of 20% indicates that only 20% of the sample has an energy efficiency ratio equal to or lower than its own.

4. Discussion

At this point, it is possible to apply both the methodologies of ENEA and EPA and compare the two results. In particular, two structures were considered. For structure A (ATECO 86.10.10 accredited to the NHS), according to the ENEA methodology, two out of three energy performance indicators were lower than the average value of the respective benchmark indicators, indicating better performances than the average ones. The EPA methodology application resulted in an energy efficiency ratio value less than one, indicating greater efficiency. Indeed, the cumulative percentage was 18%, which meant that structure A was more efficient than 82% of the health facilities in the sample.
Table 7 shows the results obtained for structure A.
According to the ENEA methodology, for structure B (ATECO 86.10.20 accredited to the NHS), two out of three energy performance indicators were higher than the average value of the respective benchmark indicators, thus indicating slightly worse performances than the average ones. The EPA methodology application resulted in an energy efficiency ratio value greater than one, indicating lower efficiency. Indeed, the cumulative percentage was 72%, which meant that structure B was less efficient than 72% of the health facilities in the sample. Table 8 shows the results obtained for structure B.
Therefore, the two methods can be considered consistent from the point of view of the results.

5. Conclusions

The analyses carried out in this work made it possible to define energy performance benchmark indicators for the Italian private health sector, following a methodology developed by ENEA. One of the strengths that added value to the analysis was the possibility to rely on an extended dataset from the national mandatory energy audits, which allowed us to define an up-to-date and reliable benchmark for the private healthcare sector.
The analysis carried out concerned ATECO 86.10.10 (general hospitals and nursing homes) and 86.10.20 (specialist hospitals and nursing homes), as they represent the great majority of the sites present in the sample. The best results from the point of view of reliability were obtained for the EnPIs calculated by considering the number of health workers and the covered area as relevant variables, while the worst results were obtained considering the number of beds. The same benchmark indicators were also calculated considering only the health facilities accredited to the NHS, which seemed to improve the reliability of some indicators compared with those defined over the whole dataset. On the other hand, no relevant results were obtained considering specific macro-areas, such as hospitalization and diagnosis and therapy.
Concerning the reliability of the indicators determined, the results appeared to be acceptable when we considered the whole dataset (Table 2). On the other hand, when we proceeded to subdivide the dataset into macro-areas (hospitalization and diagnosis and therapy, respectively in Table 3 and Table 4), the calculation of the indicators provided low reliability. This low reliability was mainly attributable to a limited number of data in the various macro-categories due to the intrinsic difference of the structures and incomplete data collection.
A suitable solution involves improving the data collection phase for the next cycle of energy audits scheduled for 2023. We are confident that systematizing and simplifying the data collection phase by providing more specific and clearer indications on which parameters to report could significantly increase the reliability of the benchmark indicators.
It should be emphasized that although the subdivision of the dataset into macro-areas and macro-categories produced indicators with generally low reliability, this evaluation was of fundamental importance to identify further opportunities for improvement in view of the next mandatory scheduled audits.
In order to test the reliability of the proposed method, the results were compared with those obtained by using the EPA methodology. This test was performed by comparing two different health facilities and obtaining comparable results. Therefore, the two methods can be considered consistent from the point of view of the results.
Moreover, using the benchmark methodology customized in this study, healthcare facilities can independently assess their energy efficiency in reference to the performance of the Italian private healthcare sector and determine how much their energy efficiency differs from the average of the sector.
Finally, given the valid results obtained in the private health sector, it could be interesting to extend the analyses carried out to the public health sector to compare the public and private health sectors.

Author Contributions

All authors contributed equally to the idea and the design of the methodology proposed and to the deployment of the research project. F.M. and V.I. were responsible for the definition, coordination, and verification of the research activities. T.P. and C.M. analyzed the data. D.D. and A.S. prepared the original draft and incorporated the suggestions from internal and external reviewers; M.S., F.M., and V.I. contributed to the review and editing. All authors have read and agreed to the published version of the manuscript.

Funding

This work was part of the Electrical System Research (PTR 2019–2021), implemented under Programme Agreements among the Italian Ministry for Economic Development (currently Ministry of Ecological Transition) and ENEA, CNR, and RSE S.p.A.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

Not applicable.

Conflicts of Interest

The authors declare no conflict of interest.

References

1. Grassi, W.; Testi, D.; Menchetti, E.; Della Vista, D.; Bandini, M.; Niccoli, L.; Grassini, G.L.; Fasano, G. Valutazione dei Consumi nell’edilizia Esistente e Benchmark Mediante Codici Semplificati: Analisi di Edifici Ospedalieri. Available online: https://www.enea.it/it/Ricerca_sviluppo/documenti/ricerca-di-sistema-elettrico/governance/rse117.pdf (accessed on 8 November 2021).
2. Mori, A.; Lavinia, C. Caratterizzazione Energetica Delle Strutture Sanitarie del Mezzoggiorno d’Italia. 2015. Available online: https://iris.enea.it/handle/20.500.12079/6728 (accessed on 8 November 2021).
3. Mori, A.; Martini, S.; Muzi, G. Energetic Characterization of the “G. Brotzu” Hospital Enterprise, San Michele Hospital in Cagliari, According to the Programme Agreement with the Italian Ministry of Economic Development. Energ. Ambiente E Innov. 2010, 5, 72–86.
4. García-Sanz-Calcedo, J.; López-Rodríguez, F.; Cuadros, F. Quantitative Analysis on Energy Efficiency of Health Centers According to Their Size. Energy Build. 2014, 73, 7–12.
5. Christiansen, N.; Kaltschmitt, M.; Dzukowski, F. Electrical Energy Consumption and Utilization Time Analysis of Hospital Departments and Large Scale Medical Equipment. Energy Build. 2016, 131, 172–183.
6. García-Sanz-Calcedo, J.; Gómez-Chaparro, M.; Sanchez-Barroso, G. Electrical and Thermal Energy in Private Hospitals: Consumption Indicators Focused on Healthcare Activity. Sustain. Cities Soc. 2019, 47, 101482.
7. Sheppy, M.; Pless, S.; Kung, F. Healthcare Energy End-Use Monitoring; National Renewable Energy Lab: Golden, CO, USA, 2014.
8. Borges de Oliveira, K.; dos Santos, E.F.; Neto, A.F.; de Mello Santos, V.H.; de Oliveira, O.J. Guidelines for Efficient and Sustainable Energy Management in Hospital Buildings. J. Clean. Prod. 2021, 11, 129644.
9. Seifert, C.; Damert, M.; Guenther, E. Environmental Management in German Hospitals—A Classification of Approaches. Sustainability 2020, 12, 4428.
10. Ji, R.; Qu, S. Investigation and Evaluation of Energy Consumption Performance for Hospital Buildings in China. Sustainability 2019, 11, 1724.
11. Bawaneh, K.; Ghazi Nezami, F.; Rasheduzzaman, M.; Deken, B. Energy Consumption Analysis and Characterization of Healthcare Facilities in the United States. Energies 2019, 12, 3775.
12. Chung, M.; Park, H.-C. Comparison of Building Energy Demand for Hotels, Hospitals, and Offices in Korea. Energy 2015, 92, 383–393.
13. UNI CEI EN 16231:2012; Energy Efficiency Benchmarking Methodology; European Committee for Standardization: Brussels, Belgium, 2012.
14. Dahlan, N.Y.; Mohamed, H.; Kamaluddin, K.A.; Abd Rahman, N.M.; Reimann, G.; Chia, J.; Ilham, N.I. Energy Star Based Benchmarking Model for Malaysian Government Hospitals—A Qualitative and Quantitative Approach to Assess Energy Performances. J. Build. Eng. 2022, 45, 103460.
15. González González, A.; García-Sanz-Calcedo, J.; Rodríguez Salgado, D. Evaluation of Energy Consumption in German Hospitals: Benchmarking in the Public Sector. Energies 2018, 11, 2279.
16. Singer, B.C. Hospital Energy Benchmarking Guidance—Version 1.0; Lawrence Berkeley National Lab. (LBNL): Berkeley, CA, USA, 2009.
17. Hwang, D.K.; Cho, J.; Moon, J. Feasibility Study on Energy Audit and Data Driven Analysis Procedure for Building Energy Efficiency: Bench-Marking in Korean Hospital Buildings. Energies 2019, 12, 3006.
18. Kim, D.W.; Kim, Y.M.; Lee, S.E. Development of an Energy Benchmarking Database Based on Cost-Effective Energy Performance Indicators: Case Study on Public Buildings in South Korea. Energy Build. 2019, 191, 104–116.
19. Arjunan, P.; Poolla, K.; Miller, C. EnergyStar++: Towards More Accurate and Explanatory Building Energy Benchmarking. Appl. Energy 2020, 276, 115413.
20. Ma, H.; Du, N.; Yu, S.; Lu, W.; Zhang, Z.; Deng, N.; Li, C. Analysis of Typical Public Building Energy Consumption in Northern China. Energy Build. 2017, 136, 139–150.
21. de Oliveira Veloso, A.C.; Gonçalves de Souza, R.V.; dos Santos, F.N. Energy Benchmarking for Office Building Towers in Mild Temperate Climate. Energy Build. 2020, 222, 110059.
22. Shang, L.; Lee, H.W.; Dermisi, S.; Choe, Y. Impact of Energy Benchmarking and Disclosure Policy on Office Buildings. J. Clean. Prod. 2020, 250, 119500.
23. Estrella Guillén, E.; Samuelson, H.W.; Cedeño Laurent, J.G. Comparing Energy and Comfort Metrics for Building Benchmarking. Energy Build. 2019, 205, 109539.
24. Benchmarking and Building Performance Standards Policy Toolkit. Available online: https://www.epa.gov/statelocalenergy/benchmarking-and-building-performance-standards-policy-toolkit (accessed on 8 November 2021).
25. Bruni, G.; De Santis, A.; Herce, C.; Leto, L.; Martini, C.; Martini, F.; Salvio, M.; Tocchetti, F.A.; Toro, C. From Energy Audit to Energy Performance Indicators (EnPI): A Methodology to Characterize Productive Sectors. The Italian Cement Industry Case Study. Energies 2021, 14, 8436.
26. Degree Days Calculation. Available online: https://www.degreedays.net (accessed on 8 November 2021).
27. ENEA; Assoimmobiliare. Benchmark di Consumo Energetico degli Edifici per Uffici in Italia. 2021. Available online: https://www.enea.it/it/Stampa/File/Rapporto_BenchmarkConsumiUffici_EneaAssoimmobiliare_2019.pdf (accessed on 8 November 2021).
28. EN ISO 50001:2018; Energy Management Systems—Requirements with Guidance for Use; ISO: Geneva, Switzerland, 2018.
29. ENERGY STAR Score for Hospitals (General Medical and Surgical). Available online: https://www.energystar.gov/buildings/tools-and-resources/energy-star-score-hospitals-general-medical-and-surgical (accessed on 8 November 2021).
Figure 1. (a) Number of sites by ATECO code 86.1, (b) number of beds by ATECO code 86.1, (c) number of workers by ATECO code 86.1, and (d) accreditation of sites to the NHS.
Figure 2. ENEA methodology.
Figure 3. Graphical representation of EnPIbmk_ca (toe/m2) for the ATECO code 86.10.10.
Figure 4. Distribution of the energy efficiency ratios.
Figure 5. Cumulative distribution of the energy efficiency ratios.
Table 1. The number of audits for each subcategory of ATECO 86.1.

| ATECO Code | Description | Number of Audits |
| 86.10.10 | General hospitals and nursing homes | 89 |
| 86.10.20 | Specialist hospitals and nursing homes | 47 |
| 86.10.30 | Institutes, clinics, and university polyclinics | 9 |
| 86.10.40 | Hospitals and long-term nursing homes | 0 |
Table 2. Results of the EnPIs calculations.

| Sample Sites | EnPIbmk | EnPIavg ± st.dev. | Reliability |
| ATECO 86.10.10 | EnPIbmk_ca (toe/m2) | 0.052 ± 0.023 | Average |
| | EnPIbmk_hw (toe/health worker) | 2.101 ± 0.950 | Average |
| | EnPIbmk_b (toe/bed) | 4.275 ± 2.593 | Low |
| ATECO 86.10.20 | EnPIbmk_ca (toe/m2) | 0.050 ± 0.031 | Low |
| | EnPIbmk_hw (toe/health worker) | 2.278 ± 0.875 | Average |
| | EnPIbmk_b (toe/bed) | 7.268 ± 7.453 | Low |
| ATECO 86.10.10 accredited to NHS | EnPIbmk_ca (toe/m2) | 0.049 ± 0.023 | Average |
| | EnPIbmk_hw (toe/health worker) | 1.959 ± 0.902 | Average |
| | EnPIbmk_b (toe/bed) | 3.738 ± 2.010 | Average |
| ATECO 86.10.20 accredited to NHS | EnPIbmk_ca (toe/m2) | 0.057 ± 0.030 | Average |
| | EnPIbmk_hw (toe/health worker) | 2.426 ± 0.867 | Average |
| | EnPIbmk_b (toe/bed) | 8.546 ± 7.711 | Low |
Table 3. Reliability evaluation for the EnPIs related to hospitalizations.

| Hospital Ward | Reliability EnPIbmk_dh | Reliability EnPIbmk_sh |
| Overall areas of hospitalization (ward present in 24 sites) | Low | Average; EnPIbmk_sh (toe/m2) = 0.042 ± 0.021 |
| Intensive care (ward present in 12 sites) | Low | Low |
| Day surgery (ward present in 10 sites) | Low | Average; EnPIbmk_sh (toe/m2) = 0.051 ± 0.021 |
| Dialysis (ward present in 2 sites) | - | - |
| Gyms and rehabilitation (ward present in 12 sites) | Low | Low |
Table 4. Reliability evaluation for the EnPIs related to diagnosis and therapy.

| Hospital Ward | Reliability EnPIbmk_ns | Reliability EnPIbmk_ss |
| Operating block (activity provided by 23 sites) | Low | Low |
| Sterilization (activity provided by 12 sites) | Low | Low |
| Radiology and diagnostic imaging (activity provided by 27 sites) | Low | Low |
| First aid (activity provided by 10 sites) | Low | Low |
| Functional and endoscopic examinations (activity provided by 25 sites) | Low | Low |
| Transfusion center (activity provided by 4 sites) | Low | Low |
| Laboratory diagnostics (activity provided by 19 sites) | Low | Low |
Table 5. Regression analysis results for Energy Use Intensity.

| Variable | Results |
| Dependent variable | Energy use intensity |
| Observations | 38 |
| R2 | 0.5108 |
| Adjusted R2 | 0.4677 |
| Standard error | 0.0188 |

| | Coefficients | Significance |
| Intercept | 0.06965 | 0.00004 |
| Health workers per square meter | 1.47221 | 0.00000 |
| Beds per square meter | −2.70713 | 0.00409 |
| Cooling degree days | −0.00015 | 0.02352 |
Table 6. Regression analysis results for the cumulative percentage.

| Variable | Results |
| Dependent variable | Cumulative percentage |
| Observations | 38 |
| R2 | 0.9965 |
| Adjusted R2 | 0.9962 |
| Standard error | 0.0180 |

| | Coefficients | Significance |
| Intercept | 0.06177 | 0.04378 |
| Health workers per square meter | −0.55418 | 0.00001 |
| Beds per square meter | 1.50108 | 0.00000 |
| Cooling degree days | −0.50357 | 0.00000 |
Table 7. Results of the comparison between ENEA methodology and EPA methodology for structure A.

| ENEA methodology results | Results for structure A | EPA methodology results | Results for structure A |
| EnPIbmk_ca (toe/m2) = 0.049 ± 0.023 | EnPI_ca (toe/m2) = 0.042 | EUI predicted (toe/m2) = 0.065 | EUI actual (toe/m2) = 0.042 |
| EnPIbmk_hw (toe/health worker) = 1.959 ± 0.902 | EnPI_hw (toe/health worker) = 1.396 | EER = 0.63 | - |
| EnPIbmk_b (toe/bed) = 3.738 ± 2.010 | EnPI_b (toe/bed) = 5.641 | Cumulative percentage = 18% | - |
Table 8. Results of the comparison between the ENEA methodology and EPA methodology for structure B.

| ENEA methodology results | Results for structure B | EPA methodology results | Results for structure B |
| EnPIbmk_ca (toe/m2) = 0.057 ± 0.030 | EnPI_ca (toe/m2) = 0.063 | EUI predicted (toe/m2) = 0.051 | EUI actual (toe/m2) = 0.063 |
| EnPIbmk_hw (toe/health worker) = 2.426 ± 0.867 | EnPI_hw (toe/health worker) = 3.231 | EER = 1.24 | - |
| EnPIbmk_b (toe/bed) = 8.546 ± 7.711 | EnPI_b (toe/bed) = 4.951 | Cumulative percentage = 72% | - |
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
