4.2
Building Performance

Joanna Eley

University activity globally is growing rapidly as our economies depend ever more on an educated, knowledge-based workforce. University estates and campuses are growing to accommodate the expansion in this long-established form of education. The earliest UK universities had similar origins to eleventh-century Bologna, where the first university was created as a community of scholars and students. Today, university communities exist in widely varied physical environments with virtual layers being increasingly overlaid.

Universities within the UK are a large and growing sector of real estate investment, as are residences for the country’s two million university students. Regions such as Asia and the Middle East are looking to the UK and USA for expertise in education, research, and building and campus design.

This chapter summarises how 21st-century universities can meet their current aspirations and future ambitions by ensuring that their buildings perform well – and as intended. It looks at what current university clients and their designers are doing, can do and should do to judge the success of university buildings, and how the lessons learned can inform future projects and support the success of each institution.

Good Design Starts with University Performance

Universities depend for their long-term survival in part on how well the buildings and campuses perform to meet their current needs and future aspirations. University clients need designers who can create buildings that help them to deliver their core business while inspiring the staff and students. They need buildings that communicate their future vision of an environment for enquiry, knowledge and learning. They may need campuses and buildings that create environments only for the academic community, or they may instead seek to create permeable places that welcome local communities and enterprise to co-create the future. Architects and designers therefore need to understand what matters to universities and how their professional expertise can help in its delivery, and clients need to learn how the design professions can help them to achieve their goals.

Universities are regularly judged and ranked on different activities and by different parties. They have two main purposes: research and teaching. In many countries, including the UK, the research excellence of each university, each academic department and each academic member of staff is subjected periodically to scrutiny. Research excellence scores lead to rankings of each university and each department. These form the basis for governments to decide where future research funds from the public purse should best be directed, and for researchers to decide where their interests will best be supported. Architects and design teams able to create buildings in which research activity can be improved will win university projects, but must be prepared to provide irrefutable evidence of the link between research outcomes and building design. Their clients are likely to be cautious about believing claims that cannot be backed up by hard data, some of which will need to come from building performance evaluation.

To help future students decide their preferred university, teaching excellence is assessed by student surveys or other data, such as the average grades for acceptance into a programme of study, and the results are widely publicised and readily accessed. Students are commonly asked to rank aspects of the student experience, such as teaching quality, feedback and assessment, contact hours, organisation and management, facilities and infrastructure.

Architects and designers who can help to improve teaching through the spaces and places they shape are greatly in demand. Not only can they improve the environment for current staff and students, but their designs may be able to attract more students of higher academic ability who in turn will create the virtuous circle central to the needs of each university, by setting the stage for standards to continue to improve.7 Evaluation of building performance supports this process.

Evidence of Good Performance

Imagine that the design phase of a new project – a refurbishment or a new building – is complete, the construction phase is over, handover has taken place and people have moved in. As an architect or client, is now the time to look back at what has been achieved? Or should everyone have been measuring performance, or value, much earlier in the building process? Would that improve the chance of meeting goals? Common university goals are to:

  • create campuses and buildings that provide spaces to serve today’s users, and that endure by adapting over time to new expectations and needs
  • be spatially efficient, cost effective to run, easy to maintain, comfortable to use, delightful to be in and observe
  • attract the right staff and students.

To deliver designs that meet those needs, designers need to understand what users need now, what matters to them, how they behave, what servicing systems are needed and how these really do perform. In addition, they must understand emerging trends, such as how changes in pedagogy may alter space requirements.

Some of this essential information should be in the brief, and additional understanding comes through participating in its development. To strengthen future designs, architects should learn from evidence of what has worked well in the past, how conflicting agendas have been resolved and how unforeseen requirements have been met. They can learn to identify and emulate good practice from the early stages of briefing, through design development, construction and handover. In terms of the RIBA Plan of Work, lessons can be learned from Stage 7 – completed university buildings in use – that can then inform Stages 0 and 1, the early stages of briefing for the next project. Learning from experience creates a virtuous circle in which hindsight creates insight and allows effective foresight, leading to better buildings and ensuring that the built environment of a future campus community meets the needs of the future university.

Figure 4.1 Feedback leads to better buildings

Despite wide recognition of the benefits of learning from an evaluation of past projects, their processes and outcomes, post-occupancy evaluation (POE) happens far less often than it should. Many clients are reluctant to reveal any failings after large amounts of capital have been spent and years of effort have been expended. They are not motivated to spend even the relatively tiny amount of money a POE may cost after the building is complete; their concern is now to get on and use it. Professional teams may be reluctant to expose any inadvertent errors and be fearful of any implications for their professional indemnity insurance. The cost of doing a POE was not built into the original project fee, since including it might have made them less competitive than their rivals. In any case, they have moved on to the next project.

However, experienced clients and their expert design teams recognise the need to define and communicate desired outcomes and test whether they have been achieved. Fortunately, university estates clients are among the most experienced around. They are serial clients, who commission many large and small projects to maintain, adapt and renew their sizeable estates. In the UK university sector, carrying out POEs has been encouraged by university estate directors, with the intention that the university community will thereby be able to share experience and learn from one another. Techniques such as the Design Quality Indicator (DQI)8 and the Higher Education Design Quality Forum (HEDQF) post-occupancy review have been used to excellent effect.

More evaluative research is needed to pin down important outcomes. Satisfying the core mission – delivering a place that supports high-quality teaching, learning and research over time – is what design must aim for. All the basic and statutory requirements that any competently designed building should achieve must be met. Challenging sites and demanding and specialised functions must be understood and provided for. Suitable evidence for such issues must be agreed in advance and monitored in execution by all those involved. In addition, designers and clients who are developing new space types to meet new pedagogical needs, as described in earlier sections of this book, need more evidence of what works. Research into innovative spaces for new pedagogy is under way, and their use will spread if the evidence proves that they perform well, as discussed by Kenn Fisher on page 127.

University Clients Need to Articulate Their Goals

For successful projects, therefore, all those involved must be able to discuss and agree what they are seeking to achieve. Estate clients, architects and other project team members working in conjunction with the academic staff should provide statements of desired outcomes and planned targets. Usually this will require building up a shared understanding and a common vocabulary by spending time together, reviewing papers, visiting related buildings elsewhere and inviting experts to share their ideas.

Link Goals to Ways of Measuring Whether They Have Been Achieved

Desired outcomes for a campus, a masterplan or an individual building might include statements that the campus should:

  • Support innovation though collaboration and intersection of knowledge.
  • Ensure a supportive workplace for a diversity of users.
  • Make a university a place for all, not only its members.
  • Translate spaces into places, creating routes that facilitate wayfinding, build general awareness, showcase activities, create opportunities to meet and linger.
  • Define thresholds that provide clear, intuitively understood transitions between public, invited and private space.

While such objectives for an evolving campus are likely to be supported by many university clients, the question is: how can such criteria for success be systematically evaluated against the real-world campus? If these objectives are critical, then the means of measuring their success should also be agreed, and revisited during the evolution of the campus.

Open discussion of the aims of the various stakeholders is essential, as is resolution of conflicting agendas, such as the need to be economical with finances and space set against the desire to give academic staff individual offices. Or there may be a conflict between keeping capital costs low and the increased revenue expenditure that follows over the longer term.

This process is first undertaken at the vision development and briefing stages, and must itself be based on good evidence for the desirability of any conflicting goals, in order to resolve them into a single vision. It must continue throughout the project, maintaining a record of the primary objectives – the ‘must-have’ results as well as the ‘nice-to-haves’ – so that, after completion, it can be established whether the objectives have been met and whether the project performs well.
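
As an illustration only – not a method prescribed in this chapter – such a record of objectives and agreed measures can be kept in a simple structured form. A minimal sketch follows, in which every objective, measure and target is hypothetical:

```python
# Minimal sketch of an objectives register: each desired outcome is
# recorded with its priority, an agreed measure and a target, so the
# post-occupancy comparison becomes a matter of filling in results.
# All objectives, measures and targets below are hypothetical.

from dataclasses import dataclass

@dataclass
class Objective:
    statement: str      # the desired outcome, in the client's words
    priority: str       # "must-have" or "nice-to-have"
    measure: str        # the agreed way of testing the outcome
    target: str         # the agreed threshold for success
    result: str = ""    # completed at post-occupancy evaluation

register = [
    Objective("Support collaboration across disciplines", "must-have",
              "POE survey: staff reporting weekly cross-department contact",
              ">= 60%"),
    Objective("Create routes that facilitate wayfinding", "nice-to-have",
              "Observed first-visit journey times to key destinations",
              "median <= 5 minutes"),
]

# Report the must-have objectives first, then the nice-to-haves.
for obj in sorted(register, key=lambda o: o.priority != "must-have"):
    print(f"[{obj.priority}] {obj.statement} -> {obj.measure} "
          f"(target {obj.target})")
```

Keeping the measure and target beside each objective means the later evaluation can simply fill in results, rather than having to reconstruct intent years after briefing.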

Build Quality, Function and Impact

As has been mentioned, university performance is measured by methods such as the results of student and staff surveys and periodic peer assessment of research quality. Such ratings provide little information to help understand how the campus and buildings support universities in their primary objectives. Better questions about the impact and importance of buildings in providing students with the environment they need and academics with good places to teach and do their research would help universities target their investment in the built environment wisely.

The estate, the landscape, buildings and interiors must also work in their own right. Each can be judged by how well it satisfies the trio of ‘firmness, commoditie and delight’, as Sir Henry Wotton translated Vitruvius’s three characteristics essential to buildings: firmitas, utilitas and venustas.9 The DQI – launched in the UK in 2003 and in the USA in 2006 – reinterpreted these as build quality, function and impact.10

Build Quality

  • At present (2016) building performance measurement often concentrates on build quality (‘firmness’), particularly energy efficiency, reduction of carbon dioxide (CO2) emissions or other quantifiable and thus measurable aspects of environmental ‘sustainability’, such as waste and pollution or water use.11
  • More fundamental aspects of build quality – such as structural soundness or weather enclosure – are usually taken for granted by a client.

Function

  • Whether any specific user group is satisfied and has the spaces it needs can be investigated using the various POE methods discussed below.

Impact

  • If a building is liked and is praised for its qualities, it may signify that it provides delight or has a positive impact.
  • Whether a building meets stakeholder expectations may be measured against desired outcomes.
  • Designers and clients are proud to receive design awards signifying public praise; however, subjective criteria sometimes creep in, so an award may not guarantee quality or appropriateness.

Creating a Feedback Loop

Universities are in a good position to establish feedback loops. Estates departments in universities share experience through various groups, so they can learn from each other. Despite considerable differences between individual universities, they face similar challenges: estates that are continuously changing to meet evolving needs, new pedagogies, changes in IT and shifting expectations of their users, as well as a generally similar annual timetable in which much work is done over the summer when main teaching sessions are not taking place. With support from the Association of University Directors of Estates (AUDE) and the HEDQF, the POE culture could – and indeed should – be stimulated so that students and academics benefit from better places to learn and conduct research, created by a more intelligent and informed construction industry providing higher-quality buildings. The need to learn and to pass on lessons becomes more acute as change accelerates and as competition for the best students and academics becomes more intense.

Feedback supports better future performance and helps to create a ‘learning organisation’. Evaluation can highlight ‘quick wins’, immediate, low-cost improvements for the benefit of current users.

Despite the increased acceptance of the concept of POEs, even when these are conducted, the results are not well publicised. Some comparative information is available from the websites of the Usable Building Trust12 and of the HEDQF.13 However, as mentioned, neither clients nor professionals yet share such information freely as a matter of good practice. The RIBA and other professional institutes, and AUDE on behalf of the client body, should be encouraged to put more emphasis on the importance of shared information and feedback to increase the value provided by design. Designers could be rewarded for publicising POE results, while clients should be encouraged to invest in sharing such information.

Look Back and Forward

Projects can take several years from the first vision through the briefing, design and construction process to completion. Are there obvious times to check performance? Should one wait until after completion to review what has been built? How long after handover is a building really finished? Can it be evaluated during the design process?

A ‘wash-up’ session – a post-construction review – does not measure performance or quality. It happens too soon to do more than start to evaluate the process. Awards and prizes are commonly judged too early if based on visits when a project has just been completed and may still have snags. A POE done six months to a year after the building occupants move in is more realistic. A good time to start is after a full teaching year and with experience of the annual weather pattern. A building about to be replaced or extensively renovated should be evaluated before work starts, so that the brief can be systematically informed by current experience.

Visits to other buildings before finalising a brief – together with data about their performance and the views of estates and facilities managers and of students – will help crystallise specific outcomes to be met by the new project, identifying some of the post-occupancy issues that those buildings illustrate.

Many processes emphasise the importance of evaluation as a regular rather than a one-off activity:

  • Building performance evaluation (BPE) methods stress the importance of evaluation and review at all stages of the procurement cycle: planning, briefing, design, post-construction and occupation.
  • Soft Landings, adopted for UK public buildings, monitors some aspects against agreed targets throughout the project and may continue for years afterwards.
  • Annual estate management statistics (EMS data) have to be provided to the Higher Education Funding Council for England (HEFCE), including teaching space utilisation, functional suitability and other building-based measures.

One can also ask how well a building has ‘stood the test of time’, perhaps many years after it was completed and occupied, after many modifications to its form and function, renewals of short-life elements and changes of fashion in aesthetic or other design areas. This is in itself a measure of quality, and ways to measure it are needed.

Examples of Characteristics to Measure

Build quality

  • Internal environmental quality, energy efficiency, CO2 emissions, water consumption etc., testing actual use against predictions
  • Health and wellbeing data
  • Cost in use – whole-life value

Function

  • Accessibility and ease of wayfinding
  • Amounts of space, adjacencies and the extent to which they match the brief
  • Appropriate levels of utilisation of spaces for general and specialist teaching, learning, offices and other uses – a potentially wider range than required by HEFCE
  • Productivity – perceived or actual
  • Use of flexibility and adaptability features
  • Ease of management

Impact

  • User satisfaction with comfort – air quality, temperature, visual, noise, etc.
  • Client satisfaction – meeting desired outcomes
  • Improvements/changes desired by users
  • Awards – immediate and over time
  • Image/brand recognition
  • Recruitment/retention of staff and students

What and How to Measure

‘If you can’t measure it you can’t improve it.’ – Lord Kelvin

The box above suggests targets to consider. Some exist in regulation or guidance; others may need to be agreed between the client and the design team. As the project evolves, adjustments may be needed, which must be agreed and recorded.

Measuring success in meeting these targets can be done in a range of ways.

William Bordass, in Designing Better Buildings,14 describes four techniques for collecting the necessary information:

  1 Observation: of exemplars and then of the project in use
  2 Facilitated discussions
  3 Questionnaires and interviews
  4 Physical monitoring and analysis of performance statistics.

This matches closely the well-established WorkWareLEARN toolkit15 (by Alexi Marmot Associates) used for higher education institution (HEI) buildings, which allows relevant data to be systematically collected and reviewed against agreed targets, and benchmarked against other projects and places.

The BRE Design Quality Matrices described for buildings generally in The Design Quality Manual: Improving Building Performance16 suggest five broad categories of information to be collected:

  1 Architecture: a subjective measure
  2 Environmental engineering: against objective figures
  3 User comfort: against measurable conditions
  4 Whole-life costs: overall occupancy costs
  5 Detailed design: as poor details or bad execution often lead to ‘failures’.

The University of Sunderland (see box below) suggested desired outcomes for their refurbished labs.17 These are expressed in terms specific to their needs and framed their own project, though many, slightly reworded, would apply to other buildings.

University of Sunderland Objectives for their Sciences Complex Refurbishments

  • Reinvigoration of existing buildings to provide upgraded laboratory teaching and office facilities.
  • Transition from department-owned to cross-faculty facilities.
  • New work environments for staff, supporting interaction and collaborative working.
  • Provision of flexible and effective laboratory research facilities.
  • Accommodating larger group sizes in laboratories and increasing space utilisation.
  • Efficient spatial use of shared facilities supported by good practice in timetabling.
  • Positive impact on student experience.
  • Vacating poor quality space as a result of efficiencies in space use through consolidation and shared facilities.

Some of these outcomes will be easier to ‘measure’ than others, but for each a way to measure could be defined, and many could be measured before the project starts to create a benchmark for success. A POE can also be conducted for a whole campus (see box below).

Evaluation Tool Kits

There are many available measurement ‘tools’, often grouping several different methods as ‘kits’. Several stress the need to precede a project with an assessment to establish goals and targets so that later it is possible to monitor progress in achieving them.

Some tools in regular use include:

  • The HEDQF ‘no shame, no blame’ assessment method.18 A day of workshops covers briefing, design, construction and use, attended by relevant stakeholders including the staff and student users. Positive and negative lessons for each stage of the process are reported to improve subsequent projects.
  • AUDE and HEFCE sponsored a guide to POE,19 a comprehensive document explaining the purpose of POE and incorporating templates for those wishing to measure against very full checklists and numerical targets.
  • Alexi Marmot Associates’ WorkWareLEARN toolkit has been used for over 200 university buildings and provides information from over 15,000 staff and students, allowing universities to evaluate and benchmark their building and estate performance against others.
  • Leesman developed their office index into ‘Leesman Education’20 to collect self-reported subjective views: ‘My-Uni’ covers 11 aspects of how students feel the university buildings affect their educational experience. The Leesman Campus Survey focuses on employees’ assessments of the buildings’ ability to support them and how it affects their sense of pride and productivity, as well as satisfaction with specific features and services.
  • Space utilisation surveys (SUS) provide observed data on how often teaching rooms are used (frequency in relation to the number of opportunities during an observation period) and by how many students (occupancy in relation to capacity); a minimal calculation is sketched after this list. This provides a proxy for efficient use of space.21 Similar techniques were originally developed for office space and are sometimes used by universities to review the way their academic and support offices are used.
  • Building Use Studies created a survey tool that collects the views of building users – generally in offices, though now used under licence in a wide range of building types.22 Its focus is on comfort and building services as well as users’ perceptions of the support for their activities provided by the building.
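
To make the SUS arithmetic concrete, here is a minimal sketch under the frequency-times-occupancy definition given above; the room, its capacity and all observation counts are invented for illustration:

```python
# Minimal sketch of the space utilisation survey (SUS) arithmetic:
# utilisation = frequency x occupancy. The room name, capacity and
# all observation counts below are hypothetical.

from dataclasses import dataclass

@dataclass
class RoomObservations:
    name: str
    capacity: int           # seats available
    slots_observed: int     # timetabled opportunities observed
    slots_in_use: int       # slots in which the room was actually used
    headcounts: list        # headcount at each in-use observation

    @property
    def frequency(self) -> float:
        """Share of observed slots in which the room was in use."""
        return self.slots_in_use / self.slots_observed

    @property
    def occupancy(self) -> float:
        """Average headcount when in use, as a share of capacity."""
        if not self.headcounts:
            return 0.0
        return sum(self.headcounts) / len(self.headcounts) / self.capacity

    @property
    def utilisation(self) -> float:
        """The usual single-figure proxy: frequency x occupancy."""
        return self.frequency * self.occupancy

room = RoomObservations(
    name="Seminar Room A", capacity=40,
    slots_observed=20, slots_in_use=12,
    headcounts=[25, 30, 18, 22, 28, 20, 35, 15, 24, 26, 19, 21])

print(f"frequency   {room.frequency:.0%}")    # 60%
print(f"occupancy   {room.occupancy:.0%}")    # ~59%
print(f"utilisation {room.utilisation:.0%}")  # ~35%
```

A room used in 60 per cent of observed slots at roughly 59 per cent of capacity scores about 35 per cent utilisation, which is why frequency and occupancy are usually reported separately as well as combined.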

Tools to help make decisions must be used with care. In a questionnaire seeking people’s opinions, the number of respondents matters. When a prize is awarded, the qualifications of the judges are relevant. When key performance indicators (KPIs) are measured, the way the data is collected must be known if its value is to be understood. Meanwhile, the realisation that people learn in different ways is leading to pedagogical changes, with new space types and new furniture arrangements.

Now the challenge is to spread the information more widely.

POE Could Take Place at Campus Level


The Dublin Institute of Technology estate at Grangegorman has started with a clearly fixed layout for its brand-new campus – in particular the ground-level treatment and landscaping – providing the opportunity for an entire campus to rise gradually but coherently from a blank canvas. A POE in, say, 2020 will be able to ask whether the detailed campus planning rules, enshrined in an immutable planning permission, have succeeded in creating what was wanted: a lively pedestrian place, well connected to the local area, attracting top-quality students and staff and allowing collaboration between hitherto separate disciplines.

Universities Are Increasingly Using These Various Techniques as a Matter of Good Practice and Are Benefitting from Them

Bath Spa University, in its new Commons Building, is using a range of POE methods, including space occupancy and timetable analysis as well as interviews with those who developed the brief and with current users. This is being done as a matter of good practice and, crucially, to provide insight. Commons has pioneered new settings for informal learning and for staff workplaces: how these are used, what people feel about them and what they have achieved against the objective of better collaboration within and between both groups will provide valuable evidence for future projects as people acclimatise to the new ‘norms’.

Queen Mary University of London has reviewed a refurbishment project using the HEDQF single-day process and has been able to capture important ideas to use immediately on a similar project.

Lessons from Evaluation

The same mistakes are still made time and again. Some findings from POEs appear regularly. Openness about both success and failure is essential.

Results from 16 PROBE POEs on office buildings carried out between 1995 and 2002, looking primarily at building services and user comfort, found some recurrent problems even though the buildings were generally good.23 These included, among others:

  • interface issues between work packages and for users
  • shortcomings with handover processes
  • user dissatisfaction with environmental comfort
  • higher than anticipated energy use.

HEDQF workshops regularly find communication problems, loss of continuity of team members and the difficulty of providing academic staff with time to participate effectively in all stages. If the client – the university – cannot find suitably motivated individuals and provide them with space and time in their schedules to act as true client representatives, it should be no surprise if a building project is in some respects disappointing to the users.

Questionnaires frequently show users to be least satisfied with heating, cooling, air quality and noise. University staff and students regularly criticise inadequate WiFi and mobile phone coverage and strength. Uncomfortable furniture – especially chairs with attached writing surfaces – and inflexibility are regular targets for dissatisfaction. Reviews of finished buildings show that energy use models, prepared early in a project to guide services design, have not accurately predicted the final built situation. For example, Carbon Buzz data,24 provided by a selection of HEIs, shows that on average CO2 emissions are almost twice the design target: 39.9 kg CO2/sqm/yr predicted at the design stage versus 77.8 kg CO2/sqm/yr when actual energy data in use are reviewed. The detail behind this finding indicates a wide spread between buildings that performed well and those that were much worse than predicted. The latter can learn from the former.

There is, however, a note of caution in relation to energy use here: one major issue is that the building models are generally compiled for building regulation compliance. They usually contain unrealistic usage patterns and exclude unregulated energy (equipment), which can be significant. A successful building may be more intensively used and have longer operational hours, both factors likely to push up energy consumption while reflecting a popular, well-used building and excellent use of resources. Perhaps carbon emissions should be measured against financial turnover, teaching outcomes or research outcomes and income?
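
The arithmetic behind the performance gap, and the alternative denominators just suggested, can be sketched in a few lines; the two per-square-metre figures are the Carbon Buzz averages quoted above, while the floor area, student numbers and turnover are invented purely for illustration:

```python
# Minimal sketch of the design-vs-actual 'performance gap' and of
# normalising emissions by denominators other than floor area.
# The per-m2 figures are the Carbon Buzz averages quoted in the text;
# the floor area, student numbers and turnover are hypothetical.

design_kgco2_per_m2 = 39.9   # average predicted at design stage
actual_kgco2_per_m2 = 77.8   # average measured in use

gap = actual_kgco2_per_m2 / design_kgco2_per_m2
print(f"performance gap: {gap:.2f}x the design estimate")   # ~1.95x

floor_area_m2 = 8_000        # hypothetical building
annual_kgco2 = actual_kgco2_per_m2 * floor_area_m2

students = 2_500             # hypothetical intensity denominators
turnover_gbp_m = 12.0        # annual turnover in GBP millions

print(f"{annual_kgco2 / students:,.0f} kg CO2 per student per year")
print(f"{annual_kgco2 / turnover_gbp_m:,.0f} kg CO2 per GBP m turnover per year")
```

Normalising by students or turnover, rather than floor area alone, rewards intensive use of a building instead of penalising it, which is the point the caution above is making.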

Acting on all these regular and disappointing findings must be built into project governance, be explicit in briefs and be costed and allowed for in the monitoring and management of the design and construction process. It is unacceptable to keep making the same mistakes.

Research to Answer Questions

Some questions remain at least partially unanswered and occasionally unasked. More research is needed, though it must avoid the type of bias illustrated below.

Students’ Interest in the Physical Environment

The HEDQF and AUDE have carried out some research to show that students feel that ‘Place Matters’.25 It can be hard to survey students and academics about this, but more work is needed to make it happen. The views of the users need to be systematically sought by whatever means. The case study from the LSE on pages 162–167 is an example of the value of seeking this feedback.

Solo Offices for Academics

UK universities have wrestled with whether academics can be asked to work in open-plan areas without destroying their ability to meet students and do great research. Questions fly around the university estates network asking where open plan for academics has worked. There seem to be few answers, although newer buildings are getting bolder about the use of open plan than historic campuses. There is an urgent need for careful evaluation and communication with the users on this subject.

An Office of One’s Own is Overwhelmingly Desired by Academics

This type of communication and ‘research’ will confirm the existing bias.

One English faculty, faced with the ruling that they would have to share larger offices in an old building where room sizes could not be altered, created badges for all staff and room door stickers saying ‘Virginia Woolf – A Room of One’s Own.’

In August 2015 the Guardian Higher Education Network circulated the following: ‘Calling all senior academics: has your office been turned into an open-plan space? We’re looking to hear from those who hate working in a shared office to talk about how it’s impacted on their work. Email me if that’s you.’ Some university departments have accepted much smaller offices, but one of the key issues is that the academic work that is most highly rewarded – scholarly writing for peer-reviewed journals and research grants – is, for various reasons, best carried out in an office.26 This was tested most recently at TEFMA in Wollongong,27 where an audience of primarily Directors of Estates noted that there was very little uptake of activity-based workplaces by academics.

From Kenn Fisher’s literature review and several academic office consultant reports, it appears that the key issues needing to be understood are:

  • cross-disciplinarity
  • status/hierarchy
  • power relations
  • academic work practices
  • the economics of offices
  • temporal issues
  • environmental systems
  • psycho-social factors
  • acoustics
  • labour relations
  • surveillance.

Until these issues have been examined and analysed across all academic disciplines in significant depth, it is unlikely that open-plan or activity-based workspaces will be accepted by the academic community, at least in Australian universities where this research is taking place.

The Australian perspective on academic offices mirrors UK experience. Questions there about changing space use and capacity have prompted investigations into how often spaces are altered to meet needs, and which design features make such alterations easy or problematic. Follow-up could be integrated into POE studies to understand what changes have been implemented and why, and how long after the initial creation of the space.

Utilisation of specialist and innovative teaching and learning spaces needs to be rethought. Pedagogical changes, shifts in electronic interaction, new emphasis on shared learning, project-based science teaching and new student expectations make room capacities fluid and change how space-use efficiency and effectiveness in meeting university goals should be judged.

Aligning Pedagogy with Space

In concluding this book with a section on ‘performance’, it is fitting that the fitness for purpose of learning environments should focus on pedagogical and curriculum issues. In the next section, Kenn Fisher, of the University of Melbourne, Australia, discusses his experience of research now taking place.
