Chapter 10

The End-to-end Performance of LTE-Advanced

Marc Werner, Valeria D'Amico, David Martín-Sacristán, Jose F. Monserrat, Per Skillermark, Ahmed Saadani, Krystian Safjan and Hendrik Schöneich

This chapter addresses the end-to-end performance of one particular IMT-Advanced system, namely the Third Generation Partnership Project (3GPP) LTE-Advanced (LTE-A). This technology moves beyond 3GPP Long Term Evolution (LTE) Release 8 (Rel-8), which is currently starting to be deployed in selected countries, by incorporating some of the innovations described in the previous chapters. By LTE-A we refer to Release 10 (Rel-10) and beyond of the 3GPP LTE standard. This version of LTE was submitted to the International Telecommunication Union – Radiocommunication Sector (ITU-R) for evaluation and obtained the official International Mobile Telecommunications Advanced (IMT-Advanced) label. The Wireless World Initiative New Radio + (WINNER+) project evaluated LTE-A in its role as an official IMT-Advanced Evaluation Group.

The chapter starts with a description, in section 10.1, of the ITU-R process for defining IMT-Advanced radio access technologies. The ITU-R evaluation scenarios and performance requirements are characterized. Then, some key features of LTE Rel-10 are introduced in section 10.2, clarifying which of the technologies described in the previous chapters are included in this system. The performance of LTE-A in International Telecommunication Union (ITU) scenarios, according to the 3GPP self-evaluation as well as extensive simulations by the WINNER+ Evaluation Group, is evaluated and analyzed in section 10.3. The channel model and simulator calibration activities in the WINNER+ Evaluation Group, which led to a consolidated set of evaluation results, are described in sections 10.4 and 10.5. The chapter ends with an outlook on the further IMT-Advanced process.

10.1 IMT-Advanced Evaluation: ITU Process, Scenarios and Requirements

In 2008, the ITU-R Working Party 5D (WP 5D) started its formal process for the identification of “Fourth Generation (4G)” radio access technologies to be qualified as IMT-Advanced systems. The IMT-Advanced label from ITU-R is especially important from a radio regulatory perspective because the operation of a mobile communication system in certain frequency bands might be restricted to IMT-Advanced systems in some areas. This was the case, for example, for Universal Mobile Telecommunication System (UMTS) bands in Europe, which were reserved for International Mobile Telecommunications 2000 (IMT-2000) systems. ITU-R describes IMT-Advanced systems as “[...] mobile systems that include the new capabilities of IMT that go beyond those of IMT-2000. Such systems provide access to a wide range of telecommunication services including advanced mobile services, supported by mobile and fixed networks, which are increasingly packet-based [...]” (ITU-R 2008a). This definition is technology agnostic and service oriented. However, IMT-Advanced systems are based on new radio interfaces to differentiate themselves from mere IMT-2000 (Third Generation (3G)) enhancements. The only specific technical performance target for 4G established from the beginning was a peak data rate of 100 Mbps for high mobility and 1 Gbps for low mobility.

Report ITU-R M.1645 (ITU-R 2003) describes the trends towards an increased demand for wireless communication (in particular, an increased number of users, higher data rates, and an increased quality of service) and translates the increased demand into new technical requirements for 4G systems. Figure 10.1 (the so-called ITU van diagram) illustrates the relation between IMT-2000 and systems beyond IMT-2000 (now called IMT-Advanced).

Figure 10.1 Illustration of capabilities of IMT-2000 and IMT-Advanced (ITU-R 2003). Reproduced by permission of © 2008 ITU

img

10.1.1 ITU-R Process for IMT-Advanced

In order to identify IMT-Advanced Radio Interface Technologies (RITs), a stepwise process was defined by the ITU-R. The process is summarized in the following list and applies to the period from 2008 to 2011:

1. Issuance of a circular letter: invitation for submission of IMT-Advanced proposals.

2. Development of candidate RITs and Set of RITs (SRIT).

3. Submission and reception of the proposals.

4. Evaluation of RITs/SRITs by evaluation groups.

5. Review and coordination of outside evaluation activities.

6. Review to assess compliance with minimum performance requirements.

7. Consideration of evaluation results, consensus building and decision.

8. Development of radio interface recommendation(s).

In accordance with the first step, in March 2008, the ITU invited all its member states and Radiocommunication Sector members, via a circular letter (ITU-R 2008b), to submit technology proposals of RITs for IMT-Advanced. The circular letter also referred to supporting documentation in which the process and materials required for submission were explained. All candidate proposals had to be accompanied by a self-evaluation of the RIT according to the IMT-Advanced performance requirements described in Report ITU-R M.2134 (ITU-R 2008c), also known as “IMT.TECH” among the evaluation groups. These performance requirements are reproduced in section 10.1.3. Report ITU-R M.2135 (ITU-R 2009c), known as “IMT.EVAL”, specified further guidelines for the evaluation of RITs for IMT-Advanced, such as evaluation methods, simulation procedures, test environments and deployment scenarios, antenna characteristics and, in particular, a dedicated spatial channel model (on the basis of the WINNER-II models (Döttling et al. 2009; WINNER-II 2008) and the 3GPP Spatial Channel Model (SCM) (3GPP 2009c)) to be used in all evaluations. Details about the IMT.EVAL evaluation scenarios are described in section 10.1.2, and the channel model implementation and calibration are described in section 10.4.

By October 2009, a number of IMT-Advanced RIT proposals had been submitted to ITU-R in accordance with the circular letter. All submissions were based either on the 3GPP system LTE-A or on IEEE 802.16m, that is, Worldwide Interoperability for Microwave Access (WiMAX). The evaluation groups therefore had to assess only two different system proposals (both of which, however, came in different duplexing configurations, that is, Frequency Division Duplex (FDD) and Time Division Duplex (TDD) variants). The submissions were accompanied by extensive self-evaluations as required by the ITU-R process.

Several evaluation groups registered with ITU-R during the development phase of the candidate proposals. They were asked to assess any combination of performance requirements for any of the system proposals. The WINNER+ Evaluation Group set its focus on the LTE-A proposal, while also reviewing the WiMAX proposal and keeping track of its evaluation. The registered evaluation groups are listed below:

  • ARIB Evaluation Group
  • ATIS WTSC
  • Canadian Evaluation Group (CEG)
  • Chinese Evaluation Group (ChEG)
  • ETSI
  • Israeli Evaluation Group (IEG)
  • Russian Evaluation Group (REG)
  • TCOE India
  • TR-45
  • TTA PG707
  • UADE, Instituto de Tecnología (Argentina)
  • WiMAX Forum Evaluation Group (WFEG)
  • Wireless Communications Association International (WCAI)
  • WINNER+

During the evaluation period, several workshops were held by both 3GPP and Institute of Electrical and Electronics Engineers (IEEE) for the external evaluation groups in order to discuss the evaluation status and provide answers to arising questions. Moreover, the evaluation groups cooperated in the implementation of IMT.EVAL recommendations. For instance a calibration of channel model implementations was achieved between the WINNER+ and Chinese evaluation groups.

The evaluation of the LTE-A system proposal according to steps 4-6 of the IMT-Advanced process mentioned above is described in the subsequent sections of this chapter.

10.1.2 Evaluation Scenarios

According to the evaluation rules of ITU-R (ITU-R 2009c), IMT-Advanced candidate proposals need to fulfill a set of 13 minimum performance requirements in four specific test environments that reflect future use cases of IMT-Advanced systems. Each environment is associated with a deployment scenario that specifies the simulation setup, for example intersite distance, carrier frequency, maximum transmit powers and channel model. The following test environments are defined:

  • Indoor: indoor environment targeting isolated cells at offices and/or in hotspots, based on stationary and pedestrian users.
  • Microcellular: urban microcellular environment with higher user density focusing on pedestrian and slow vehicular users.
  • Base coverage urban: urban macrocellular environment targeting continuous coverage for pedestrian up to fast vehicular users.
  • High speed: macrocellular environment with high-speed vehicles and trains.

Each test environment focuses on a specific application for the candidate RIT/SRITs and is accompanied by specific values of the performance criteria to be met by the RIT/SRITs. For each of these test environments, at least one deployment scenario was defined to be used for the performance evaluation of candidate RIT/SRIT (see Table 10.1 and Figure 10.2).

Figure 10.2 Deployment scenarios (ITU-R 2009c). Reproduced by permission of © 2008 ITU

img

Table 10.1 Test environments and deployment scenarios.

Test environment Deployment scenario
Indoor Indoor Hotspot (InH)
Microcellular Urban Microcell (UMi)
Base Coverage Urban Urban Macrocell (UMa)
Suburban Macrocell (SMa) – optional
High Speed Rural Macrocell (RMa)

The deployment scenarios given in (ITU-R 2009c) are:

  • Indoor Hotspot (InH): small isolated cells at offices or hotspot areas. Targets high user throughput and high user density. All users are pedestrians. Two base stations operating at 3.4 GHz with an omnidirectional antenna setup are mounted on the ceiling of a long hall with adjacent offices.
  • Urban Microcell (UMi): high traffic and user density for city centers and dense urban areas. Outdoor and outdoor to indoor propagation characteristics for pedestrian users are assumed. Continuous hexagonal deployment is used with three sectors per cell and below rooftop antenna mounting. Base stations operate at 2.5 GHz and have an intersite distance of 200 m.
  • Urban Macrocell (UMa): targets ubiquitous coverage for urban areas. A similar hexagonal deployment is used with larger intersite distance of 500 m and antennas mounted clearly above rooftop. Non-line-of-sight or obstructed propagation conditions are common for this scenario. Only vehicular users at moderate speed are assumed, suffering from an additional outdoor to in-car penetration loss. Base stations operate at 2 GHz.
  • Rural Macrocell (RMa): similar to UMa, but targets larger cells with support for high-speed vehicular users. Base stations have an intersite distance of 1732 m and operate at 800 MHz, which is more suitable for large cells.
  • Suburban Macrocell (SMa): this is an optional scenario for the same test environment as the UMa scenario. The key differences are an increased intersite distance of 1299 m and a mix of indoor and high-speed vehicular users.
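
The scenario parameters listed above can be captured in a small lookup table. The following Python sketch records only the values stated in the text; the field names are illustrative, and `None` marks parameters not given here:

```python
# Deployment scenario parameters as stated in section 10.1.2 (ITU-R 2009c).
# Field names are illustrative; None marks values not given in the text.
DEPLOYMENT_SCENARIOS = {
    "InH": {"carrier_ghz": 3.4, "isd_m": None, "users": "pedestrian"},
    "UMi": {"carrier_ghz": 2.5, "isd_m": 200, "users": "pedestrian"},
    "UMa": {"carrier_ghz": 2.0, "isd_m": 500, "users": "vehicular"},
    "RMa": {"carrier_ghz": 0.8, "isd_m": 1732, "users": "high-speed vehicular"},
    "SMa": {"carrier_ghz": None, "isd_m": 1299, "users": "indoor and high-speed vehicular"},
}

def scenario_line(name):
    """Format a one-line summary of a deployment scenario."""
    p = DEPLOYMENT_SCENARIOS[name]
    return f"{name}: carrier {p['carrier_ghz']} GHz, ISD {p['isd_m']} m, {p['users']} users"
```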

10.1.3 Performance Requirements

In Report M.2134 (ITU-R 2008c), ITU-R specifies minimum performance requirements that candidate radio technologies must fulfill. The evaluation criteria are grouped by their evaluation method: inspection, analytical calculation or simulation.

Evaluation by inspection only requires evaluation groups to check if the candidate proposal addresses and meets the requirement. Inspection requirements are:

  • Bandwidth: the candidate systems must support scalable bandwidth allocations up to and including 40 MHz. Furthermore, proponents are encouraged to support bandwidths of up to 100 MHz. It must be possible to demonstrate that the system supports at least three different bandwidth allocations, including the minimum and maximum values for the candidate system.
  • Intersystem handover: the candidate systems must support intersystem handover between the candidate IMT-Advanced system and at least one IMT-2000 system.
  • Deployment possible in at least one of the identified International Mobile Telecommunications (IMT) bands.
  • Channel bandwidth scalability.
  • Support for a wide range of services: candidate systems must be able to support multiple service classes such as background, streaming, interactive and conversation.

Analytical evaluation involves some calculations to determine whether the candidate meets the minimum requirement. The analytical requirements are:

  • Peak spectral efficiency: the gross data rate offered by the physical layer of the candidate technology, normalized by the bandwidth. This criterion makes it possible to estimate the overhead introduced by the physical layer.
  • Control plane latency: this allows estimation of the call setup duration. It is measured as the state transition time, for example the time needed to move a User Equipment (UE) from idle to active mode.
  • User plane latency: a maximum transmission time for IP packets through the radio access network is specified.
  • Intrafrequency and interfrequency handover interruption time: candidates are required to support seamless handovers between cells of the system; therefore, a maximum handover interruption time is specified.
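
As a minimal illustration of the first analytical criterion, peak spectral efficiency is simply the physical-layer peak data rate normalized by the occupied bandwidth. The sketch below checks illustrative (not standardized) rate/bandwidth pairs against the Table 10.3 minima of 15 bps/Hz (DL) and 6.75 bps/Hz (UL):

```python
# Minimum peak spectral efficiency requirements (bps/Hz), from Table 10.3.
PEAK_SE_REQ = {"DL": 15.0, "UL": 6.75}

def peak_spectral_efficiency(peak_rate_bps, bandwidth_hz):
    """Peak data rate of the physical layer normalized by bandwidth (bps/Hz)."""
    return peak_rate_bps / bandwidth_hz

def meets_peak_se(link, peak_rate_bps, bandwidth_hz):
    """True if the (rate, bandwidth) pair satisfies the ITU-R minimum."""
    return peak_spectral_efficiency(peak_rate_bps, bandwidth_hz) >= PEAK_SE_REQ[link]
```

For example, a hypothetical 1 Gbps downlink peak rate in 40 MHz corresponds to 25 bps/Hz and would pass the DL check; the numbers are purely illustrative.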

Some aspects of IMT-Advanced candidate systems cannot be investigated analytically and need to be addressed by simulation. These requirements are:

  • Cell spectral efficiency: IMT-Advanced systems should provide their users with high data rates. The assigned spectrum must be utilized efficiently.
  • Cell-edge user spectral efficiency: high data rates must be provided to users, while at all times a minimum data rate should be available to cell edge users. Cell spectral efficiency and cell-edge user spectral efficiency are to be determined in the same simulation runs.
  • Mobility: the candidate system should be able to operate at UE speeds of up to 350 km/h. This is evaluated by link-level simulations.
  • Voice over IP (VoIP) capacity: IMT-Advanced systems should not only be able to support high data rates but also a large number of users. The VoIP capacity is used to evaluate the maximum load of users – with rather low traffic demands – that can be supported.
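
The two spectral-efficiency metrics above are typically computed from the per-user throughput samples of a system-level simulation: cell spectral efficiency is the aggregate throughput normalized by bandwidth and number of cells, and cell-edge user spectral efficiency is the 5th-percentile point of the CDF of normalized user throughput. A sketch (the percentile convention follows ITU-R M.2134; variable names are illustrative):

```python
import numpy as np

def cell_spectral_efficiency(user_tput_bps, bandwidth_hz, n_cells):
    """Aggregate user throughput normalized by bandwidth and
    number of cells (bps/Hz/cell)."""
    return np.sum(user_tput_bps) / (bandwidth_hz * n_cells)

def cell_edge_spectral_efficiency(user_tput_bps, bandwidth_hz):
    """5th-percentile point of the CDF of user throughput,
    normalized by bandwidth (bps/Hz)."""
    return np.percentile(user_tput_bps, 5) / bandwidth_hz
```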

Tables 10.2, 10.3 and 10.4 provide the numerical values of the performance requirements set by ITU-R for IMT-Advanced candidates, assessed by inspection, analytical calculation and simulation, respectively.

Table 10.2 IMT-Advanced performance requirements to be assessed by inspection.

Parameter [unit] Required value
Bandwidth [MHz] up to 40
Intersystem handover supported
Deployment possible in at least one of the identified IMT bands possible
Support for a wide range of services All service classes defined in (ITU-R 2009c) in at least one test environment
Channel bandwidth scalability Support of at least three bandwidth values

Table 10.3 IMT-Advanced performance requirements to be assessed by analytical calculation.

Parameter [unit] DL/UL Required value
Peak spectral efficiency [bps/Hz/cell] DL 15
UL 6.75
Control plane latency [ms] n.a. <100
User plane latency [ms] n.a. <10
Intrafrequency handover interruption time [ms] n.a. 27.5
Interfrequency handover interruption time within a spectrum band [ms] n.a. 40
Interfrequency handover interruption time between spectrum bands [ms] n.a. 60

Table 10.4 IMT-Advanced performance requirements to be assessed by simulation.

img

The subsequent sections provide an overview of the evaluation work that has been carried out in WINNER+ to analyze the LTE-A performance with respect to the three groups of requirements. The detailed assessment results are reported in a WINNER+ deliverable (WINNER+ 2010).

10.2 Short Introduction to LTE-Advanced Features

In 2008, 3GPP held two “3GPP IMT-Advanced Workshops” (Network 2008a, 2008b). The goal of these workshops was to investigate the main changes that could be brought forward to evolve the Evolved Universal Terrestrial Radio Access (E-UTRA) radio interface as well as the Evolved Universal Terrestrial Radio Access Network (E-UTRAN) (3GPP 2010a) in the context of IMT-Advanced. In particular, the LTE-A study item was initiated in order to study the development of LTE based on a new set of requirements. This initiative collected operators' and manufacturers' views in order to develop and test innovative concepts that will satisfy the needs of next-generation communications. The requirements were gathered in 3GPP TR 36.913 (3GPP 2009b). The resulting technical report was published in June 2008 and a liaison was sent to ITU-R covering the work in 3GPP RAN on LTE-A towards IMT-Advanced. Finally, 3GPP contributed to the ITU-R towards IMT-Advanced via its proposal “3GPP LTE Rel-10 & Beyond (LTE-Advanced)” (ITU-R 2009a).

The new technical features of LTE-A are defined in 3GPP TR 36.814 (3GPP 2010b). The main technical features are:

  • Support of wider bandwidth

    Carrier aggregation, where two or more component carriers are aggregated, is considered for LTE-A. Each component carrier has a bandwidth of up to 20 MHz; aggregation thus supports Downlink (DL) transmission bandwidths larger than 20 MHz, for example up to 100 MHz. A UE may simultaneously receive or transmit on one or multiple component carriers depending on its capabilities.

  • Extended multiantenna configurations

    Extension of LTE DL spatial multiplexing to up to eight layers is considered. For the UL, spatial multiplexing of up to four layers is supported.

  • Coordinated multipoint transmission or reception (CoMP) transmission and reception

    This feature is considered a tool to improve the coverage of high data rates and the cell-edge throughput, and/or to increase system throughput. It implies dynamic coordination among multiple geographically separated transmission points. However, for LTE-A Rel-10, there will be no new communication link for the support of CoMP in the inter-eNodeB (eNB) mode, that is, through the standardized X2 interface between eNBs.

  • Relaying functionality

    Relaying is supported for LTE-A as a tool to improve, for example, the availability of high data rates, group mobility, temporary network deployment, the cell-edge throughput and/or to provide coverage in new areas.

For more details about the LTE-A features, the reader is referred to section 6.2 (for CoMP), section 9.3 (for extended multiantenna configurations) and section 7.2 (for relaying functionality).
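
The carrier-aggregation rule described under "Support of wider bandwidth" — component carriers of at most 20 MHz each, aggregated towards a 100 MHz total — can be sketched as a simple validation function. This is a toy model: real Rel-10 carrier aggregation also constrains carrier positions and band combinations, which are not modeled here.

```python
def aggregated_bandwidth_mhz(component_carriers_mhz,
                             max_cc_mhz=20.0, max_total_mhz=100.0):
    """Total aggregated bandwidth of a set of component carriers.

    Each component carrier may use at most 20 MHz; the aggregate is
    targeted at up to 100 MHz (section 10.2). Raises ValueError when
    either limit is violated.
    """
    for cc in component_carriers_mhz:
        if cc > max_cc_mhz:
            raise ValueError(f"component carrier {cc} MHz exceeds {max_cc_mhz} MHz")
    total = sum(component_carriers_mhz)
    if total > max_total_mhz:
        raise ValueError(f"aggregate {total} MHz exceeds {max_total_mhz} MHz")
    return total
```

Five 20 MHz carriers aggregate to the 100 MHz target, while a single 40 MHz carrier would be rejected.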

10.2.1 The WINNER+ Evaluation Group Assessment Approach

The WINNER+ project produced consistent research work on the optimization of radio interface concepts for IMT-Advanced systems, based on the heritage of the activities carried out in the former European Union Sixth Framework Programme project Wireless World Initiative New Radio (WINNER). In addition to developing enabling technologies for the WINNER+ system concept, in November 2008, WINNER+ registered as an independent evaluation group towards ITU-R in order to drive the IMT-Advanced development process.

The evaluation procedure is designed in such a way that the overall performance of the candidate RITs/SRITs may be fairly and equally assessed on a technical basis. Based on the available expertise in IMT-Advanced radio technology concepts and in link- and system-level simulation tools, the WINNER+ Evaluation Group targeted the 3GPP LTE Rel-10 & Beyond (LTE-Advanced) proposal (ITU-R 2010b). The WINNER+ group evaluated all 13 minimum requirements for IMT-Advanced systems according to (ITU-R 2008c) (see section 10.1) by means of analytical, inspection and simulation activities in order to perform a full evaluation of the LTE-A candidate technology.

For simulation purposes, and in order to guarantee the reliability of the results, each evaluated characteristic was assessed by several partners. During the course of the work, emphasis was placed on reflecting the realistic behavior of the system under consideration by modeling nonideal aspects including, for example, the effects of channel estimation errors, channel quality indicator (CQI) measurement errors and feedback delay, as well as a correct modeling of the overhead in the system. The simulators of the different partner organizations were calibrated in order to provide consistent results.

10.3 Performance of LTE-Advanced

10.3.1 3GPP Self-evaluation

Together with the submission of LTE Rel-10 as an IMT-Advanced technology proposal, 3GPP also performed and submitted an evaluation of the LTE Rel-10 technology. Such an evaluation, in which the technology proponent evaluates its own radio interface proposal to verify that it meets the requirements, is referred to in ITU-R as a self-evaluation. The 3GPP self-evaluation covers the FDD RIT and the TDD RIT and is part of the submissions of LTE Rel-10 to ITU-R. It is also available in the 3GPP technical report (3GPP 2009a, Section 16).

In short, the 3GPP self-evaluation report covers all the technical, spectrum and service requirements. The overall conclusion is that both the FDD RIT and the TDD RIT fulfill all the requirements in the four test environments and, hence, LTE Rel-10 meets all the IMT-Advanced performance requirements.

Technical requirements are evaluated by means of inspection, analytical evaluation, or simulations, whereas the spectrum and service requirements are evaluated by means of inspection only. As for inspection and analytical evaluations, 3GPP summarized the necessary data and provided explanatory information regarding how LTE Rel-10 performs in the respective areas. As for the evaluations that are performed by means of simulations, several organizations contributed to this work and the 3GPP self-evaluation report provides the average values of all samples together with the number of samples.

Table 10.5 summarizes the 3GPP self-evaluation results for the different technical requirements. In many of the evaluations, such as, for example, cell spectral efficiency and VoIP capacity, 3GPP reported results for different antenna configurations and transmission schemes. In addition, for the DL cell spectral efficiency and cell-edge user spectral efficiency evaluations 3GPP performed evaluations for different Physical Downlink Control CHannel (PDCCH) overhead assumptions (L = 1, 2, or 3 Orthogonal Frequency Division Multiplexing (OFDM) symbols per subframe). For the cases where different options were considered, Table 10.5 provides a range of figures.

Table 10.5 Outcome of the 3GPP self-evaluation of technical requirements (ITU-R 2009a). © 2009 European Telecommunications Standards Institute.

img

All the details of the 3GPP self-evaluation can be found in (3GPP 2009a). However, some further information about the results of the 3GPP cell spectral efficiency and cell-edge spectral efficiency evaluations is given here. In the DL, 3GPP performed evaluations for different PDCCH overhead assumptions and for a few antenna setups and transmission schemes, for example, Single-User (SU)-Multiple-Input Multiple-Output (MIMO), Multi-User (MU)-MIMO, CS/CB-CoMP, and JP-CoMP. In most cases, four base station transmit antennas were utilized but a few evaluations were also performed assuming eight transmit antennas. FDD and TDD perform relatively similarly in most cases, except for JP-CoMP in the base coverage urban scenario, where the TDD option seems to take advantage of the available channel reciprocity. Even with the higher PDCCH overhead of L = 3 OFDM symbols, both the FDD and the TDD RITs fulfill the DL requirements given in Table 10.4. The most challenging test environment seems to be the base coverage urban scenario.

In the Uplink (UL), most of the spectral efficiency and cell-edge spectral efficiency evaluations were performed using Rel-8 Single Input Multiple Output (SIMO) transmission, Rel-8 MU-MIMO or SU-MIMO (2x4). As in the DL, the evaluations show that the requirements are fulfilled in all test environments and that the performance of the FDD RIT and the TDD RIT is very similar. Notably, the UL self-evaluation indicates that all requirements are already fulfilled by LTE Rel-8 technology.

10.3.2 Simulative Performance Assessment by WINNER+

This section presents the LTE-A assessment made by the WINNER+ group. First, the cell and cell-edge spectral efficiencies are addressed for the different test environments for the UL, DL, FDD RIT and TDD RIT. Next, the performance evaluation results related to mobility are shown; the traffic channel link data rates and mobility classes are considered. Finally, the VoIP capacity results are presented. The detailed results as well as the simulation assumptions are described in (WINNER+ 2010). Furthermore, the analytical and inspection performance assessments by WINNER+ can be found in Appendices E.1.2 and E.1.1, respectively.

Cell and Cell Edge Spectral Efficiency

The requirements and simulation results for the cell spectral efficiency are summarized for the FDD and TDD RIT in Table 10.6. Cell-edge spectral efficiency results are summarized in Table 10.7. These results are averaged over different simulators using the same antenna configurations. The simulations were conducted using the common set of parameters summarized in Table 10.8. Detailed simulation results and assumptions are described in (WINNER+ 2010). The simulation results show that the requirements (see Table 10.4) are achieved for all environments by using different MIMO features of LTE-A.

Table 10.6 Cell spectral efficiency (CSE) results for FDD and TDD RIT.

img

Table 10.7 Cell edge spectral efficiency (CESE) results for FDD and TDD RIT.

img

Table 10.8 Models and assumptions.

Parameter Value
Bandwidth 10 MHz DL + 10 MHz UL for FDD, 10 MHz for TDD, double bandwidth for InH.
Scheduler DL: Proportional Fair in Time and Frequency.
Receiver type MMSE with intercell interference suppression capabilities in DL and UL.
Network synchronization Synchronized.
Antenna configuration at base station (a) Uncorrelated co-polarized (used for InH DL/UL and UMi DL baseline): co-polarized antennas separated four wavelengths.
(b) Correlated co-polarized (used otherwise) 0.5 wavelengths between antennas.
Antenna configuration at UE Vertically polarized antennas 0.5 wavelengths separation at UE.
Channel estimation Nonideal channel estimation.

Mobility

The traffic channel link data rates for the FDD and TDD RITs in the different environments are shown and compared to the ITU-R requirements in Table 10.9. The results for FDD are the average of results obtained under different assumptions, so that the mean value does not represent the performance of one particular system setup. The obtained results show that the requirements are fulfilled. The support of mobility classes for the TDD and FDD RITs is summarized in Table 10.10; the requirements are also achieved for all environments.

Table 10.9 Traffic channel link data rates for FDD and TDD RIT.

img

Table 10.10 Mobility class support for FDD and TDD RIT.

img

VoIP Capacity

The VoIP capacity simulation results for FDD RIT and TDD RIT are described in Table 10.11. Different assumptions were made in the underlying simulations, so that the mean values do not represent the performance of one particular system setup. It is shown that the requirements are achieved for all environments.

Table 10.11 VoIP capacity for FDD and TDD RIT.

img

10.3.3 LTE-Advanced Performance in the Rural Indian Open Area Scenario

The ITU-R defined the evaluation criteria but remained open to adding new evaluation environments. This opportunity was exploited in ITU-R (2010a), which proposed an additional deployment scenario referred to as the Indian Rural environment. The “Rural Indian Open Area” deployment scenario assumes large-cell coverage. The parameters of the scenario, such as carrier frequency, UE antenna height and intersite distance, may take several different values; for example, the intersite distance ranges from 30 km to 50 km, which corresponds to typical distances between villages in India. In this scenario, UEs are at fixed positions with rooftop directional antennas.

The “Rural Indian Open Area” scenario is interference limited, so there is very limited sensitivity to frequency bands, antenna heights, and intersite distance. In Table 10.12 system performance in this scenario, as simulated by WINNER+, is summarized and compared with other scenarios. The results obtained by TCOE India and WINNER+ evaluation groups confirm that the requirements for this test scenario are met.

Table 10.12 LTE-A performance in rural Indian open area compared with performance in other test scenarios.

img

10.4 Channel Model Implementation and Calibration

10.4.1 IMT-Advanced Channel Model

The IMT-Advanced channel model defined by the ITU-R is a stochastic, geometry-based model that allows the creation of a bidirectional radio channel (ITU-R 2009c). Although the model knows the exact locations of the transmitting and receiving elements, it does not specify the positions of the scatterers; only the ray directions are known.

Figure 10.3 shows a transmitter and receiver and all existing rays between them. Moreover, this figure represents the concept of cluster, or propagation path – in space, time and angle – that consists of a set of rays affected by nearby scatterers. The figure also includes the concept of Angle of Departure (AoD) and Angle of Arrival (AoA), both at cluster and ray level.

Figure 10.3 IMT-Advanced MIMO channel

img

As the model is stochastic, the channel parameters are derived from a set of statistical distributions obtained from channel measurements. These measurements were made in extensive trials covering different propagation scenarios under Line of Sight (LoS) and Non Line of Sight (NLoS) conditions. The statistical distributions are defined, for example, in terms of delay, Delay Spread (DS), Shadow Fading (SF) and cross-polarization ratio.

In the IMT-Advanced channel model there are two different sets of channel parameters. The first one is related to the large-scale parameters, such as SF and path loss. The second one concerns small-scale parameters, including angle of arrival and departure of the rays.

In order to generate channel samples between one transmitter and one receiver, the mobility and exact location of both ends must be known. Based on this, all large-scale parameters are generated, followed by the small-scale parameters. All these parameters are kept constant during the whole simulation. Channel samples are then obtained by adding the contributions of all involved rays.
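
The generation order described above — fixed large-scale and small-scale parameters, then channel samples as a sum of ray contributions — can be illustrated with a deliberately simplified single-cluster sketch. It uses uniform linear arrays and per-ray random phases and angle offsets, and omits the delay taps, polarization and exact per-ray offset tables of the full M.2135 procedure:

```python
import numpy as np

def cluster_channel(n_rays, cluster_aoa_deg, cluster_aod_deg,
                    angle_spread_deg=5.0, n_rx=2, n_tx=2,
                    spacing_wl=0.5, rng=None):
    """Toy single-cluster MIMO channel matrix: the sum of n_rays plane
    waves with random phases and small random angle offsets around the
    cluster AoA/AoD, using uniform linear arrays at both ends."""
    rng = np.random.default_rng(0) if rng is None else rng
    H = np.zeros((n_rx, n_tx), dtype=complex)
    for _ in range(n_rays):
        aoa = np.deg2rad(cluster_aoa_deg + rng.normal(0.0, angle_spread_deg))
        aod = np.deg2rad(cluster_aod_deg + rng.normal(0.0, angle_spread_deg))
        phase = rng.uniform(0.0, 2.0 * np.pi)
        # Array steering vectors for half-wavelength-spaced ULAs.
        a_rx = np.exp(1j * 2 * np.pi * spacing_wl * np.arange(n_rx) * np.sin(aoa))
        a_tx = np.exp(1j * 2 * np.pi * spacing_wl * np.arange(n_tx) * np.sin(aod))
        H += np.exp(1j * phase) * np.outer(a_rx, a_tx)
    return H / np.sqrt(n_rays)  # keep average power independent of n_rays
```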

As explained in section 10.1.2, there are five different scenarios and thus channel models: indoor hotspot, urban microcell, urban macrocell, suburban macrocell and rural macrocell. Different models are characterized by different parameters of the statistical distributions used to generate the channel samples. Table 10.13 summarizes the main values of these distributions for each model, distinguishing among the following sight conditions: LoS, NLoS and Outdoor to Indoor (OtoI).

Table 10.13 Parameters of the IMT-Advanced channel model.

img

The first parameter, DS, is the delay spread. For each link, a normal random number is generated with mean μ and standard deviation σ; the result is the delay spread in logarithmic units (log10 of seconds). Note that, in linear scale, DS is on the order of 10⁻⁷ s, which is well aligned with the classical tapped delay line models. The lowest DS values occur for the Urban Microcell (UMi) model, whereas the highest correspond to the Rural Macrocell (RMa) model.

The Angle Spread at Departure (ASD) and Angle Spread at Arrival (ASA) parameters measure the angular dispersion of the rays at the transmitter and at the receiver, respectively. The SF values are similar for all scenarios and higher in the case of NLoS. Finally, the K factor applies only to LoS conditions, as it represents the power ratio between the line-of-sight ray and the remaining rays.
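
For instance, a large-scale parameter such as DS is drawn by sampling a normal variable in the log10 domain and converting to linear scale. The μ and σ values would come from Table 10.13; the −7.0 and 0.3 figures in the example below are assumed placeholders, consistent only with the order-of-magnitude remark above:

```python
import numpy as np

def draw_delay_spread_s(mu_log10, sigma_log10, rng=None):
    """Draw one delay-spread sample (seconds): log10(DS) is normally
    distributed with mean mu_log10 and std sigma_log10, as in the
    IMT-Advanced large-scale parameter tables."""
    rng = np.random.default_rng() if rng is None else rng
    return 10.0 ** rng.normal(mu_log10, sigma_log10)

# Assumed placeholder values: median DS of 1e-7 s, spread of 0.3 decades.
example_ds = draw_delay_spread_s(-7.0, 0.3, np.random.default_rng(0))
```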

Coordination between evaluation groups was strongly recommended by ITU-R to facilitate the comparison and consistency of results and to ease the understanding of differences between the results achieved by the independent evaluation groups. Indeed, divergence between results obtained in the evaluation of the same system is a common problem in all fora where researchers from different bodies contribute to the progress of science and technology. One way of overcoming this situation is to compare the different approaches using the same calibration process and benchmark data. In this framework, and in order to simplify the IMT-Advanced assessment, ITU-R approved a number of documents describing the evaluation process, requirements and evaluation criteria. In particular, ITU-R (2009c) contains the detailed simulation assumptions and evaluation methodologies for IMT-Advanced. This M.2135 document represents a significant calibration effort intended to ensure proper harmonization of the tools used by the external evaluation groups for the performance evaluation of IMT-Advanced technologies.

ITU-R (2009c) is mainly focused on the definition of the reference scenarios for system-level simulations, including the large- and small-scale parameters of the channel model. The new stochastic geometric model proposed by ITU-R is far from simple to implement. Several implementations were offered freely, but they are not mutually coherent and do not produce the same output statistics. Without proper calibration of the channel model implementation, it would not have been feasible to build up a consistent evaluation of the candidates. This is why channel calibration was so important from the beginning.

The following subsections describe the channel model calibration effort made in WINNER+ (2010). This action also aimed to share experience, information and benchmark data with other evaluation groups in order to foster the required coordination and unification of results.

The calibration data presented in this section focuses on one scenario, the Urban Macrocell (UMa), which serves as an example of the general procedure. More information and results can be found at the WINNER+ IMT-Advanced evaluation web page.1 Calibration is split into two main parts: large-scale and small-scale parameters.

10.4.2 Calibration of Large-Scale Parameters

For large-scale calibration, multicell system-level simulations are required. Given that this kind of simulator is also used for evaluating the IMT-Advanced requirements on cell spectral efficiency, cell-edge user throughput, VoIP capacity and mobility, large-scale parameter calibration is also useful for detecting potential simulator inconsistencies among contributors. Some important properties of the system simulations are determined by the environment description in ITU-R (2009c), including the propagation and channel models. The metrics used in this calibration are the path gain and the wideband Signal-to-Noise-plus-Interference Ratio (SNIR), which are essentially technology independent; hence, their calibration can be performed with just a few additional assumptions beyond those given in ITU-R (2009c). The path gain is defined as the average signal attenuation between a UE and its serving base-station cell. The measure includes distance attenuation, shadowing and antenna gains (both at the base station and at the UE), while the effects of fast fading are excluded. The path gain may hence be defined as the difference between the (average) received power and the (average) transmitted power:

(10.1) $\mathrm{PG} = \bar{P}_{\mathrm{rx}} - \bar{P}_{\mathrm{tx}} \quad [\mathrm{dB}]$

The DL wideband SNIR is the (average) power received from the serving cell in relation to the (average) received power from all other cells plus noise. For a UE connected to the base station cell i the geometry (G) is defined as:

(10.2) $G = \dfrac{P_i}{\sum_{j \neq i} P_j + P_N}$

where $P_j$ is the received power from the base station cell j and $P_N$ is the noise power.
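The geometry computation for one UE can be sketched as follows, using linear received powers; the list-based interface and the example figures are illustrative choices, not mandated by the evaluation guidelines.

```python
import math

def geometry_db(p_rx, serving, noise_power):
    """Wideband SNIR (geometry) in dB for a UE attached to cell
    `serving`: serving-cell received power over the sum of all other
    cells' received powers plus the noise power (all linear scale)."""
    interference = sum(p for j, p in enumerate(p_rx) if j != serving)
    return 10.0 * math.log10(p_rx[serving] / (interference + noise_power))

# Example: one strong serving cell and two weak interferers.
g = geometry_db([1.0, 0.05, 0.05], serving=0, noise_power=1e-3)  # ~9.96 dB
```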

In addition to the evaluation principles and assumptions in ITU-R (2009c) and the channel model clarifications that followed, Table 10.14 summarizes further assumptions that have been used to derive the path gain and wideband SNIR distributions.

Table 10.14 Large-scale assumptions.

img

Figure 10.4 presents calibration results from up to seven WINNER+ partners. Some very small differences appear in terms of path gain, but these differences decrease when considering the wideband SNIR distribution.

Figure 10.4 Path gain and wideband SNIR distributions in the UMa scenario

img

10.4.3 Calibration of Small-Scale Parameters

The small-scale fading characteristics include the delay spread and the angular spread at the base station and at the UE. For simplicity, the small-scale fading calibration assumes omnidirectional antennas at both the base station and the UE. If other antenna patterns are assumed, for example a directional antenna pattern at the base station, the results will differ. Moreover, the calibrations are performed separately for LoS, NLoS and OtoI propagation conditions; OtoI propagation is relevant only in the UMi scenario. For the calibration of the angular spread for LoS propagation channels it is important to account for the correction in ITU-R (2009b), Section 3. Now assume that each propagation channel comprises N clusters and that each cluster comprises M rays. Assume further that the delay of ray m in cluster n is denoted $\tau_{n,m}$ and that the associated power is denoted $P_{n,m}$. In the case of LoS propagation, the LoS ray is included here as a separate cluster for which, according to (ITU-R 2009c), only the first ray in the cluster has non-zero power.

To calculate the delay spread, the average delay is first calculated according to:

(10.3) $\bar{\tau} = \dfrac{\sum_{n=1}^{N} \sum_{m=1}^{M} P_{n,m}\, \tau_{n,m}}{\sum_{n=1}^{N} \sum_{m=1}^{M} P_{n,m}}$

Then, the Root-Mean-Square (RMS) delay spread $\sigma_{\tau}$ is calculated as follows:

(10.4) $\sigma_{\tau} = \sqrt{\dfrac{\sum_{n=1}^{N} \sum_{m=1}^{M} P_{n,m}\, (\tau_{n,m} - \bar{\tau})^{2}}{\sum_{n=1}^{N} \sum_{m=1}^{M} P_{n,m}}}$
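Equations (10.3) and (10.4) translate directly into code; the sketch below takes the per-ray delays and linear powers flattened over all clusters and rays.

```python
import math

def rms_delay_spread(delays, powers):
    """RMS delay spread: the power-weighted standard deviation of the
    ray delays, flattened over all clusters and rays."""
    total = sum(powers)
    mean_delay = sum(p * t for p, t in zip(powers, delays)) / total
    variance = sum(p * (t - mean_delay) ** 2
                   for p, t in zip(powers, delays)) / total
    return math.sqrt(variance)
```

For two equal-power rays delayed by 0 and 200 ns, the RMS delay spread is 100 ns, consistent with the order of magnitude noted for DS above.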

For the angular spread we use the circular angular spread $\sigma_{\mathrm{AS}}$ as defined in (3GPP 2009c, Annex A), where the angular spread is the minimum spread over different linear shifts Δ. One small addition is used here, however: before calculating $\sigma_{\mathrm{AS}}$, the shifted angles $\theta_{n,m}$ are wrapped into the interval $[-\pi, \pi)$ according to Equation (10.5). This step is not explicitly stated in (3GPP 2009c).

(10.5) $\theta_{n,m} \leftarrow \mathrm{mod}(\theta_{n,m} + \pi,\; 2\pi) - \pi$
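A minimal sketch of the circular angular spread follows, assuming a simple uniform discretization of the shift Δ (the definition minimizes over all shifts; 360 grid steps is an illustrative choice, not part of the definition).

```python
import math

def circular_angular_spread(angles, powers, n_shifts=360):
    """Minimum power-weighted RMS spread of the angles over a grid of
    linear shifts, with each shifted angle wrapped into [-pi, pi)."""
    total = sum(powers)
    best = float("inf")
    for k in range(n_shifts):
        delta = 2.0 * math.pi * k / n_shifts
        # Wrap into [-pi, pi): mod(theta + pi, 2*pi) - pi, per Eq. (10.5).
        wrapped = [((a - delta + math.pi) % (2.0 * math.pi)) - math.pi
                   for a in angles]
        mu = sum(p * a for p, a in zip(powers, wrapped)) / total
        spread = math.sqrt(sum(p * (a - mu) ** 2
                               for p, a in zip(powers, wrapped)) / total)
        best = min(best, spread)
    return best
```

Without the wrapping step, rays straddling the ±π boundary would be treated as almost 2π apart and the spread would be grossly overestimated for some shifts.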

The RMS delay spread ($\sigma_{\tau}$) and the circular angular spread ($\sigma_{\mathrm{AS}}$) at the base station and at the UE are calculated for a large number of radio links, and the corresponding distributions are compared in the calibrations depicted in Figure 10.5.

Figure 10.5 RMS delay spread for UMa NLoS (left plot) and LoS (right plot)

img

10.5 Simulator Calibration

System-level simulator calibration is the process that aims at aligning the simulation tools. Alignment is achieved through changes to a simulator, driven by feedback from comparisons with results from other simulation tools or from the literature. The purpose of calibration is to ensure that the different simulation tools produce comparable outputs.

A review of system-level simulator building blocks shows that calibration is not a trivial task, due to the system complexity and the high number of degrees of freedom in modeling and implementation. In general, system-level simulations focus on modeling the operation of network components based on the performance of multiple links between network nodes and UEs. Resource management must be modeled, as well as the effects of radio resource limitations in terms of capacity and coverage. Typical system-level simulators (Döttling et al. 2009, Chapter 12) comprise the following main building blocks:

  • user and network deployment modules;
  • radio channel model;
  • link-to-system interface;
  • traffic generator;
  • scheduler.

In practice, there is always room for differences in how any given module is modeled. Deviations can originate from partial standardization, which requires making additional, sometimes implicit, assumptions, or from bug detection in the implementation. In order to minimize differences between simulators, the use of particular system configurations with stable, well-defined features is preferred. During the IMT-Advanced evaluation process, for example, the LTE Rel-8 basic configuration (3GPP 2010b) served as such a reference. It is important to understand that the priority in the calibration step is not to achieve the best performance figures but to obtain results aligned among the calibration partners.

System-level calibration is only reasonable if there is confidence that the employed channel model is already calibrated. Building on this confidence, tests can be performed that involve as many simulator building blocks as possible, in order to achieve high coverage of the calibration range.

When planning the calibration there is also a need to decide on the indicators to be compared during the process. It is reasonable to choose performance indicators that are most important from the user-experience point of view; the normalized user throughput is one example of such a metric.

A simulator can only be claimed to be calibrated for the particular deployment scenario in which it was calibrated; calibration in one deployment scenario does not automatically imply calibration in any other.

The WINNER+ Evaluation Group performed a calibration of the available simulation tools following the approach described above. The simulators were calibrated in both UL and DL. Figure 10.6 shows an example calibration result for the UL Urban Microcell deployment scenario (ITU-R 2009c).

Figure 10.6 CDF of normalized user throughput in UL UMi deployment scenario. Results originated from three organizations

img

Numerical characterization of calibration accuracy can help to track progress towards simulator alignment and also allows well and poorly calibrated scenarios to be identified. Quantitative characterization of calibration accuracy is one of the factors that can support a decision to recognize simulators as calibrated. However, it is difficult to justify a classifier based on a particular numerical limit that would indicate a state of calibration or non-calibration; the choice of an exact threshold would be arbitrary. That is why a quantitative characterization of calibration accuracy is only one element of support for the final decision, which is made by the researcher taking multiple indicators into account.
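As one possible quantitative indicator, not prescribed by the evaluation guidelines, the maximum vertical distance between two empirical throughput CDFs (a Kolmogorov-Smirnov-type statistic) can be computed and compared against an agreed tolerance:

```python
def cdf_distance(samples_a, samples_b):
    """Maximum vertical distance between the empirical CDFs of two
    sample sets; small values indicate well-aligned simulators for
    the compared metric."""
    def ecdf(samples, x):
        # Fraction of samples less than or equal to x.
        return sum(1 for s in samples if s <= x) / len(samples)
    points = sorted(set(samples_a) | set(samples_b))
    return max(abs(ecdf(samples_a, x) - ecdf(samples_b, x)) for x in points)
```

Even with such a metric in hand, as argued above, the choice of any acceptance threshold remains a judgment call for the researcher.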

10.6 Conclusion and Outlook on the IMT-Advanced Process

As a result of the external IMT-Advanced candidate evaluations, 13 independent evaluation reports were submitted to ITU-R in addition to the WINNER+ report (ITU-R 2010b). The WINNER+ simulation results provided in this chapter have confirmed that the 3GPP LTE Rel-10 proposal satisfies all the IMT-Advanced requirements. In fact, despite differences in implementation details, all reports confirmed the proponents' claims that both LTE-A and WirelessMAN-Advanced (the WiMAX-based proposal, part of the IEEE 802.16 standard) fulfill all ITU-R requirements for IMT-Advanced systems.

Taking into account the evaluation reports, ITU-R WP 5D decided in October 2010 that both submitted IMT-Advanced system proposals, LTE-A and WirelessMAN-Advanced, have successfully met all of the criteria established by ITU-R for the first release of IMT-Advanced, qualifying them as the first “true 4G systems” (ITU-R 2010c).

These two technologies will now enter the final stage of the IMT-Advanced process (step 8 of the list given at the beginning of this chapter), that is, the development of an ITU-R Recommendation specifying the in-depth technical standards for these radio technologies. This final step is expected to be completed in early 2012.

Notes

1. http://projects.celtic-initiative.org/winner+/.

References

3GPP 2009a Further Advancements for E-UTRA (LTE-Advanced). Technical Specification Group Radio Access Network 36.912 v9.3.0, 3rd Generation Partnership Project (3GPP), http://www.3gpp.org/ftp/Specs/html-info/.

3GPP 2009b Requirements for Evolved UTRA (E-UTRA) and Evolved UTRAN (E-UTRAN). Technical Specification Group Radio Access Network 36.913 v9.0.0, 3rd Generation Partnership Project (3GPP), http://www.3gpp.org/ftp/Specs/html-info/.

3GPP 2009c Spatial channel model for Multiple Input Multiple Output (MIMO) Simulations. Technical Specification Group Radio Access Network 25.996 v9.0.0, 3rd Generation Partnership Project (3GPP), http://www.3gpp.org/ftp/Specs/html-info/.

3GPP 2010a Evolved Universal Terrestrial Radio Access (E-UTRA) and Evolved Universal Terrestrial Radio Access Network (E-UTRAN); Overall description; Stage 2. Technical Specification Group Radio Access Network 36.300 v9.5.0, 3rd Generation Partnership Project (3GPP), http://www.3gpp.org/ftp/Specs/html-info/.

3GPP 2010b LTE; Evolved Universal Terrestrial Radio Access (E-UTRA); Further Advancements for E-UTRA Physical Layer Aspects. Technical Specification Group Radio Access Network 36.814 v9.0.0, 3rd Generation Partnership Project (3GPP), http://www.3gpp.org/ftp/Specs/html-info/.

Döttling M, Mohr W and Osseiran A 2009 Radio Technologies and Concepts for IMT-Advanced. John Wiley & Sons, Ltd, Chichester.

ITU-R 2003 Framework and overall objectives of the future development of IMT-2000 and systems beyond IMT-2000. Recommendation ITU-R M.1645-E, International Telecommunications Union Radio (ITU-R), http://www.itu.int/rec/R-REC-M.1645/en.

ITU-R 2008a Background on IMT-Advanced. Document IMT-ADV/1-E, International Telecommunications Union Radiocommunication (ITU-R) Study Groups Working Party 5D, http://www.itu.int/md/R07-IMT.ADV-C-0001/en.

ITU-R 2008b Invitation for submission of proposals for candidate radio interface technologies for the terrestrial components of the radio interface(s) for IMT-Advanced and invitation to participate in their subsequent evaluation. Circular Letter 5/LCCE/2, International Telecommunications Union Radiocommunication (ITU-R) Study Groups Working Party 5D, http://www.itu.int/md/R00-SG05-CIR-0002/en.

ITU-R 2008c Requirements related to technical performance for IMT-Advanced radio interface(s). Report ITU-R M.2134, International Telecommunications Union Radio (ITU-R), http://www.itu.int/publ/R-REP/en.

ITU-R 2009a Acknowledgement of candidate submission from 3GPP proponent (3GPP organization partners of ARIB, ATIS, CCSA, ETSI, TTA AND TTC) under Step 3 of the IMT-Advanced process (3GPP technology). Document IMT-ADV/8-E, International Telecommunications Union Radiocommunication (ITU-R) Study Groups Working Party 5D, http://www.itu.int/md/R07-IMT.ADV-C-0008/en.

ITU-R 2009b Correction of typographical errors and provision of missing texts of IMT-Advanced channel models in Report ITU-R M.2135. Document IMT-ADV/3-E, International Telecommunications Union Radiocommunication (ITU-R) Study Groups Working Party 5D, http://www.itu.int/md/R07-IMT.ADV-C-0003/en.

ITU-R 2009c Guidelines for Evaluation of Radio Interface Technologies for IMT-Advanced. Report ITU-R M.2135-1, International Telecommunications Union Radio (ITU-R), http://www.itu.int/publ/R-REP/en.

ITU-R 2010a Evaluation of IMT-Advanced Candidate Technology Submissions in Documents IMT-ADV/4 and IMT-ADV/8 by TCOE India. Document IMT-ADV/16-E, International Telecommunications Union Radiocommunication (ITU-R) Study Groups Working Party 5D, http://www.itu.int/md/R07-IMT.ADV-C-0016/en.

ITU-R 2010b Evaluation of IMT-Advanced Candidate Technology Submissions in Documents IMT-ADV/6, IMT-ADV/8 and IMT-ADV/9 by WINNER+ Evaluation Group. Document IMT-ADV/22-E, International Telecommunications Union Radiocommunication (ITU-R) Study Groups Working Party 5D, http://www.itu.int/md/R07-IMT.ADV-C-0022/en.

ITU-R 2010c ITU Paves Way for Next-generation 4G Mobile Technologies – ITU-R IMT-Advanced 4G Standards to Usher New Era of Mobile Broadband Communications. Press release, International Telecommunications Union Radio (ITU-R), http://www.itu.int/net/pressoffice/press_releases/2010/40.aspx.

3GPP TSG RAN 2008a IMT-Advanced Workshop, Shenzhen, http://www.3gpp.org/ftp/workshop/2008-04-07_RAN_IMT_Advanced/.

3GPP TSG RAN 2008b LTE-Advanced Workshop, Prague, http://www.3gpp.org/ftp/tsg_ran/TSG_RAN/TSGR_40/LTE-Advanced workshop/.

WINNER+ 2010 Celtic Project CP5-026 Final conclusions on end-to-end performance and sensitivity analysis (ed. Werner M.) Public Deliverable D4.2, Wireless World Initiative New Radio – WINNER+, June 2010, http://projects.celtic-initiative.org/winner+/index.html.

WINNER-II 2008 IST-4-027756 WINNER II Channel Models. Public Deliverable D1.1.2, Wireless World Initiative New Radio – WINNER, April 2008, http://projects.celtic-initiative.org/winner+/index.html.
