Using Simulated Annealing to Solve Telecommunication Network Design Problems

 


Marcus Randall, Graham McMahon, Steve Sugden

School of Information Technology
Bond University, Robina, Australia


 



Abstract

Simulated Annealing is a robust optimisation algorithm. We apply this technique to solve a minimum cost network synthesis problem, which is common in the design of telecommunication networks. The formulation we use models a number of practical problems with hop-limit, degree and capacity constraints. Our solution strategy consists of building and modifying a set of routes that form the network. Using this approach, we solve moderately large networks (up to 30 nodes) efficiently. Another important aspect of this work is an exploration of how local search transition operators are applied to this problem.

Keywords: Simulated Annealing, network synthesis problem

1 Introduction

This paper investigates the solving of a network synthesis problem with the meta-heuristic search technique known as simulated annealing (SA) (van Laarhoven and Aarts 1987). This problem is common in the design of efficient telecommunications networks. The aim of the problem is to satisfy all traffic requirements between a set of nodes at minimum cost. For the network synthesis problem involving n nodes, there are $n^{n-2}$ possible topologies, e.g. around 100 million possibilities for a network as small as 10 nodes. The information required to formulate the problem is the traffic demand between each origin-destination (O-D) pair, and the cost function for carrying traffic on each (possible) link between nodes i and j.

Recent interest in the problem has been in the application of Genetic Algorithms (GAs). Work by Berry, Murtagh, Sugden and McMahon (1995, 1997) has demonstrated this approach, in which a GA is used to synthesise the network while a Linear Program (LP) is used to allocate the traffic amongst the routes of the network. However, this approach has the potential to be inefficient due to the typically large LP component (though no empirical investigation was undertaken in Berry et al. (1995, 1997)). A new approach by Randall, McMahon and Sugden (1999) has used SA to solve the problem as a modified knapsack problem, with very encouraging results. In this work, routes were precomputed and selected to form the network. Other solution techniques have been described in Balakrishnan, Magnanti, Shulman and Wong (1991), Gavish (1985), Kershenbaum and Peng (1986), Minoux (1989) and Sharma, Mistra and Bhattacharji (1991).

2 Formulations

There are numerous ways that the network synthesis problem can be formulated mathematically. We first discuss a traditional linear programming approach that has been used in the literature (Berry et al. 1995,1997). From this, a model is developed that represents the network as a set of routes, the individual nodes of which can be altered using local search operators.

Let $\mathcal{G}$ be the set of all undirected graphs on n nodes, with G a member of this set. We represent G by its upper triangular node-node adjacency matrix B with elements $b_{ij}$. The problem is to find a member G* which minimises the cost of transporting required origin-destination (O-D) flows subject to specified link capacity, node capacity, node degree and chain hop-limit constraints. The total bandwidth (flow) requirement on virtual path connections between O-D pair p-q is given by $F_{pq}$ (without loss of generality represented as an element of an upper triangular matrix). The partial flow along the rth route between node p and node q is denoted by $x_{pq}^{r}$, and $c_{pq}^{r}$ is the cost per unit flow on this route.

$$\min \; \sum_{p=1}^{n-1} \sum_{q=p+1}^{n} \sum_{r} c_{pq}^{r} x_{pq}^{r} \qquad (1)$$

s.t.

$$f_{ij} = \sum_{p=1}^{n-1} \sum_{q=p+1}^{n} \sum_{r} \delta_{ij}^{pqr} x_{pq}^{r} \qquad \forall\, i < j \qquad (2)$$

$$\sum_{r} x_{pq}^{r} = F_{pq} \qquad \forall\, p < q \qquad (3)$$

$$\sum_{j < i} f_{ji} + \sum_{j > i} f_{ij} \le N_{i} \qquad \forall\, i \qquad (4)$$

$$f_{ij} \le M_{ij}\, b_{ij} \qquad \forall\, i < j \qquad (5)$$

$$\sum_{i < j} \delta_{ij}^{pqr} \le H_{\max} \qquad \forall\, p < q,\ \forall\, r \qquad (6)$$

$$\sum_{j < i} b_{ji} + \sum_{j > i} b_{ij} \le D_{i} \qquad \forall\, i \qquad (7)$$

$$x_{pq}^{r} \ge 0, \quad b_{ij} \in \{0,1\} \qquad \forall\, p, q, r, i, j \qquad (8)$$

Where:

n is the number of nodes.

$c_{pq}^{r}$ is the cost per unit flow on route r between O-D pair p-q.

$F_{pq}$ is the total bandwidth (flow) requirement between O-D pair p-q.

$\delta_{ij}^{pqr}$ is 1 if the link i-j lies on route r between O-D pair p-q, 0 otherwise.

$b_{ij}$ is 1 if the link i-j is present in the network, 0 otherwise.

$f_{ij}$ is the total flow carried on the link i-j.

$M_{ij}$ is the upper bound on the available capacity (total flow) on the link i-j.

$N_{i}$ is the upper bound on total flow at node i. This flow comprises traffic originating at, terminating at and traversing node i.

$H_{\max}$ is the upper bound imposed on the number of links in a route (the hop limit).

$D_{i}$ is the upper bound on the degree of node i (the number of links incident to it).

$x_{pq}^{r}$ is the amount of traffic routed between p and q on route r.

Equations 1 to 8 represent the model of the network synthesis problem. Equation 1 is the objective function, in which the cost of the traffic carried along the routes is minimised. Equation 2 is used to calculate the total flow on each link. Constraint 3 ensures that the bandwidth is distributed among the appropriate routes. Constraint 4 ensures that the node capacities are not exceeded, while 5 ensures that the link capacities are not exceeded. The hop-limit is preserved in 6, and the node degree constraints are modelled in 7. Constraint 8 restricts the route flows to be non-negative and the link variables to be binary.

2.1 A Linked List Formulation

The solution is represented using a linked list approach (the concept for which was originally developed by Randall (1999) and Randall and Abramson (1999)). A list of list structures is used in which each sub-list represents an individual route of the network. Each route (sub-list) contains an ordered list of nodes. An example of a solution is given in Figure 1.

Figure 1: A list based solution and the network it represents.
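As an illustration, such a structure could be declared in C as follows. This is a minimal sketch only; the type and field names are illustrative and are not necessarily those used in the actual solver.

```c
#include <stdlib.h>

/* A route is an ordered list of node indices; the whole solution is a
 * linked list of such routes (a "list of lists"). */
typedef struct Route {
    int *nodes;          /* node indices in visiting order         */
    int length;          /* number of nodes currently on the route */
    int capacity;        /* allocated size of the nodes array      */
    struct Route *next;  /* next route in the solution             */
} Route;

typedef struct {
    Route *first;        /* head of the route list */
    int num_routes;
} Solution;

/* Append a node to the end of a route, growing the array as required. */
static void route_add_node(Route *r, int node)
{
    if (r->length == r->capacity) {
        r->capacity = (r->capacity == 0) ? 4 : 2 * r->capacity;
        r->nodes = realloc(r->nodes, (size_t)r->capacity * sizeof *r->nodes);
    }
    r->nodes[r->length++] = node;
}
```

Representing the solution this way makes the local search operators of Section 4.3 (moving, swapping, adding and dropping nodes or whole routes) straightforward list manipulations.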

3 Allocating Traffic Amongst Routes

The SA algorithm is used to synthesise a suitable network topology. As there may be more than one route per O-D pair, a subproblem is to determine an appropriate allocation of traffic for each route. There are two broad methods of achieving this: exactly, in which the optimal allocation of traffic to routes is determined; and heuristically, in which the allocation is determined by a special-purpose algorithm. The first approach has been used in Berry et al. (1995, 1997); however, it can be time consuming for medium to large size problems. In contrast, the heuristic approach does not seek the optimal allocation, but one that satisfies Equation 3. We have adopted the latter approach in order to produce good solutions within a reasonable amount of computational time.

The heuristic we employ is simple yet very efficient in terms of the computational time required to perform it (Randall 1999). In essence, for each O-D pair and its set of routes (those in the current solution), the algorithm proportionally loads traffic on each route according to its cost. Cheaper routes will therefore be given a greater amount of traffic than more expensive routes. Further study could examine the relationship between the complexity of the heuristic and the effect on the objective function.
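As an illustration, the following C sketch loads traffic in inverse proportion to route cost, so cheaper routes receive more flow. The exact weighting used by the solver is not detailed here, so this rule is an assumption; any rule that distributes the whole demand across the selected routes satisfies Equation 3 for that O-D pair.

```c
/* Split the demand for one O-D pair among its routes in inverse proportion
 * to route cost, so that cheaper routes carry more traffic.  The complete
 * demand is always allocated, satisfying Equation 3 for this pair.
 * Illustrative rule only; cost[r] is assumed to be strictly positive. */
void allocate_traffic(double demand, const double *cost, double *flow, int nroutes)
{
    double total_weight = 0.0;

    for (int r = 0; r < nroutes; r++)
        total_weight += 1.0 / cost[r];

    for (int r = 0; r < nroutes; r++)
        flow[r] = demand * (1.0 / cost[r]) / total_weight;
}
```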

4 Implementation

In this section, the implementation of the solver is described. The program is coded in the C language and consists of three important aspects: the SA engine, the generation of initial feasible solutions, and the nature and application of the local search operators.

4.1 SA Algorithm

The SA search engine implements Connolly's (1990) Q8-7 cooling schedule, as it has been shown to be quite successful (Abramson and Randall 1999; Connolly 1990; Randall 1999; Randall and Abramson 1999). The cooling schedule is based on reheating the temperature a number of times throughout the search. Both the number of times that this reheating occurs and the interval between reheats can be altered.
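The C fragment below sketches the general shape of an SA acceptance loop with periodic reheating. It is illustrative only: it does not reproduce Connolly's Q8-7 temperature calculations, and the function and parameter names are hypothetical.

```c
#include <math.h>
#include <stdlib.h>

/* Generic SA loop with periodic reheating.  neighbour_cost() is assumed to
 * apply a random transition operator and return the resulting solution cost. */
double anneal(double (*neighbour_cost)(double current_cost),
              double init_cost, double t_start, double alpha,
              int iters, int reheat_interval)
{
    double cost = init_cost, best = init_cost, temp = t_start;

    for (int i = 1; i <= iters; i++) {
        double cand  = neighbour_cost(cost);   /* cost after a random transition */
        double delta = cand - cost;

        /* Accept improving moves always, uphill moves with the usual
         * Boltzmann probability. */
        if (delta <= 0.0 || exp(-delta / temp) > (double)rand() / RAND_MAX)
            cost = cand;
        if (cost < best)
            best = cost;

        temp *= alpha;                         /* simple geometric cooling */
        if (i % reheat_interval == 0)
            temp = t_start;                    /* periodic reheat          */
    }
    return best;
}
```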

4.2 Generating Initial Feasible Solutions

As the problem is highly constrained, it can be difficult to form an initial feasible solution. To deal with this, the approach we have adopted consists of two phases. In the first phase, an initial solution is constructed so that at least one route per O-D pair (with non-zero demand requirements) is present.

The second phase attempts to modify this solution so that it becomes feasible. This is achieved by using SA to minimise the amount of constraint violation; when this reaches 0, a feasible solution has been found. The degree of violation is calculated according to the relational operator of each constraint. For instance, if the constraint is of the ≤ form and the left hand side is larger than the right hand side, the difference is the amount of constraint violation. The constraint violations for the other relational operators are calculated in a similar manner.
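For example, the violation of a single constraint could be computed as follows (an illustrative C sketch; the enumeration and function names are not those of the solver). The quantity minimised by SA in this phase is simply the sum of these violations over all constraints.

```c
/* Degree of violation of a single constraint, given its relational operator
 * and the evaluated left- and right-hand sides. */
typedef enum { LESS_EQUAL, GREATER_EQUAL, EQUAL } Relation;

double violation(double lhs, double rhs, Relation rel)
{
    switch (rel) {
    case LESS_EQUAL:    return (lhs > rhs) ? lhs - rhs : 0.0;
    case GREATER_EQUAL: return (lhs < rhs) ? rhs - lhs : 0.0;
    case EQUAL:         return (lhs > rhs) ? lhs - rhs : rhs - lhs;
    }
    return 0.0;   /* unreachable */
}
```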

4.3 Local Search Transition Operators

The most appropriate local search transition operators for the network synthesis problem are:

Move Node: A node is moved from one route to another.

Swap Node: Two nodes in the same or different routes swap positions.

Add Node: A node is added to the end of a route.

Drop Node: A node is dropped from a route.

Add Route: An entire route is added. The route is randomly generated, but obeys hop-limit and node uniqueness constraints.

Drop Route: An entire route is removed.

Using these operators, it is possible to explore multiple neighbourhoods in the course of solving a particular problem. In order to select a transition operator for each iteration of SA, the probabilistic approach outlined in Randall (1999) is used. In this scheme, each transition operator is assigned a probability. The probability space is partitioned into a number of sections (using these probabilities as bounds), each representing one of the transition operators. A uniform random number is generated at each SA iteration to determine the transition operator to be used.
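This selection mechanism amounts to a simple roulette-wheel choice over the operator probabilities, as the following illustrative C sketch shows (the names are hypothetical):

```c
#include <stdlib.h>

/* The six transition operators, in the order listed above. */
typedef enum { MOVE_NODE, SWAP_NODE, ADD_NODE, DROP_NODE,
               ADD_ROUTE, DROP_ROUTE, NUM_OPERATORS } Operator;

/* Choose an operator by partitioning [0,1) into intervals whose widths are
 * the operator probabilities and seeing where a uniform random number falls. */
Operator choose_operator(const double prob[NUM_OPERATORS])
{
    double u = (double)rand() / ((double)RAND_MAX + 1.0);
    double cumulative = 0.0;

    for (int op = 0; op < NUM_OPERATORS; op++) {
        cumulative += prob[op];
        if (u < cumulative)
            return (Operator)op;
    }
    return DROP_ROUTE;   /* guard against floating point rounding */
}
```

For example, the probabilities of set 2 in Table 1 would be supplied as the array {0.2, 0.2, 0.2, 0.2, 0.1, 0.1}.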

Different probability settings will affect the performance of the SA solver. As such, a range of transition operator probability sets is used to test the solver. Table 1 shows the probability settings used. These settings coarsely cover the probability space. Only five sets are used due to the computational effort required to evaluate each set.

 

Table 1: Transition probability settings. Note: the set number is used to uniquely identify the transition operator set.

Set   Add   Drop   Move   Swap   Add Route   Drop Route
1
2     0.2   0.2    0.2    0.2    0.1         0.1
3     0.3   0.3    0.1    0.1    0.1         0.1
4     0.1   0.1    0.3    0.3    0.1         0.1
5     0.1   0.1    0.1    0.1    0.3         0.3

5 Computational Experience

A set of problem instances has been generated with which to test the SA solver (see Table 2). We have generated these instances with demand matrices that are 90% dense, meaning that 90% of the O-D pairs have non-zero traffic requirements. Consequently, these problems should be difficult for SA to solve. The problems (as well as a problem generator) are available at http://www.it.bond.edu.au/randall/dynamtele.exe.
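For readers who wish to construct instances of a similar character, the following C sketch fills a demand matrix to a given density. It is illustrative only and is not the generator used to produce the instances above; the uniform demand distribution is an assumption.

```c
#include <stdlib.h>

/* Fill the upper triangle of an n x n demand matrix F so that roughly
 * `density` of the O-D pairs (e.g. 0.9) receive a non-zero requirement drawn
 * uniformly from 1..max_demand.  Illustrative only. */
void generate_demands(int n, double density, int max_demand, int **F)
{
    for (int p = 0; p < n; p++) {
        for (int q = p + 1; q < n; q++) {
            if ((double)rand() / RAND_MAX < density)
                F[p][q] = 1 + rand() % max_demand;
            else
                F[p][q] = 0;   /* no traffic requirement for this pair */
        }
    }
}
```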

The computing platform used to perform the experiments is an IBM SP2 consisting of 22 RS6000 model 590 processors with a peak performance of 266 MFLOPs per node.

 

Table 2: Problem instances used in this study.

Name       Size (Nodes)
tele5-1    5
tele5-2    5
tele10-1   10
tele10-2   10
tele20-1   20
tele20-2   20
tele30-1   30
tele30-2   30

The results are given in Table 3. In this table, the minimum, average and maximum costs are recorded for each problem and each transition operator set. For the problems of up to 20 nodes, at most 2 hours of computational time was available for each run. The 30 node problems were given up to 8 hours per run.

 

Table 3: Computational results.

Problem   Set   Minimum Cost   Average Cost   Maximum Cost
t5a1      1     958            1073.6         1194
          2     956            1025.7         1092
          3     957            1017.6         1069
          4     964            1072.2         1161
          5     1030           1117.9         1161
t5a2      1     718            769.7          831
          2     715            743.8          798
          3     664            737            827
          4     710            779.4          798
          5     710            769.7          837
t10a1     1     3595           3870           4049
          2     3528           3921.1         4180
          3     3333           3694.3         3963
          4     3431           3890.8         4355
          5     3611           3881.4         4060
t10a2     1     1843           2061.7         2338
          2     1805           2060.3         2244
          3     1808           2028.7         2199
          4     1910           2047.7         2237
          5     1978           2152.2         2337
t20a1     1     10028          10719.3        11278
          2     9336           9901           10495
          3     9488           9862.9         10358
          4     9537           10252.9        11140
          5     11377          11680.8        12067
t20a2     1     11170          11640.2        12303
          2     10544          11298.6        12054
          3     10974          11326.3        11870
          4     11969          12464          13039
          5     12183          12804.2        13604
t30a1     1     8345           8489.1         8687
          2     7821           8402.8         8643
          3     7760           8167.6         8415
          4     7615           8566.9         8764
          5     8638           8713.86        8817
t30a2     1     10183          10297.4        10483
          2     10110          10273          10381
          3     9910           10047.22       10333
          4     10247          10434.8        10567
          5     10417          10447.8        10518

In order to analyse this data, we use a one-way ANOVA (ANalysis Of VAriance) procedure. This test is applied to determine whether there are consistent, significant differences between the transition probability sets. A highly significant result was obtained (p = 0.000 at α = 0.05), indicating that the performance of the solver varies significantly with the transition operator set used. Using post-hoc analysis, we can establish a rank for each transition operator set. This is given in Table 4.

 

Table 4: Ranking of each transition operator set. An equal ranking signifies that there were no significant differences between the sets.

Rank   Set
1      3
1      2
2      1
2      4
3      5

From the results in Table 4, it can be seen that sets 2 and 3 were the best overall, while set 5 was the worst. This indicates that transition operator sets placing more emphasis on local search operators that perturb the solution the least (such as add and drop) were more successful. As set 5 used the add route and drop route transitions extensively, its overall performance was poor. The application of local-search based transition operators to meta-heuristic techniques warrants further investigation, as performance varies significantly with the choice of operators. This phenomenon has also been noted in Randall (1999).

6 Conclusions

We have shown that SA is capable of solving a complex network synthesis problem with difficult constraints. This problem has applications in the domain of telecommunication network design.

The results indicate that the use of local search transition operators that make small perturbations to the solution gives improved performance over those that make large changes. The transition operators responsible for large perturbations in the search space are add route and drop route, as whole routes are created and incorporated into the network, or removed completely, respectively. In contrast, the add and drop transitions merely extend or shorten an individual path by one link. Therefore, these results give empirical support to the notion that small changes over many iterations can produce good quality solutions. It must be noted, however, that the occasional use of the larger transition operators is necessary, otherwise parts of the solution space may become inaccessible and hence unexplored.

The next stage of our research will focus on developing other meta-heuristic strategies, such as tabu search (Glover and Laguna 1997), GRASP (Feo and Resende 1995) and ant search (Dorigo, Maniezzo and Colorni 1996), to solve this problem. We are also investigating alternative methods for allocating traffic to routes, as this may provide improvements to overall solution quality.

Acknowledgement

The authors wish to thank the Queensland Parallel Supercomputing Facility for the use of the IBM SP2 computer.

References

  • Abramson, D. and Randall, M. (1999), A Simulated Annealing Code for General Integer Linear Programs, Annals of Operations Research, 86, 3-21.
  • Balakrishnan, A., Magnanti, T., Shulman, A. and Wong R. (1991), Models for Planning Capacity Expansion in Local Access Telecommunication Networks, Annals of Operations Research, 33, 239-284.
  • Berry, L., Murtagh, B., Sugden, S. and McMahon, G. (1995), Application of a Genetic-based Algorithm for Optimal Design of Tree-structured Communications Networks, Proceedings of the Regional Teletraffic Engineering Conference of the International Teletraffic Congress, South Africa, 361-370.
  • Berry, L., Murtagh, B., Sugden, S. and McMahon, G. (1997), An Integrated GA-LP Approach to Communication Network Design, Proceedings of the 2nd IFIP Workshop on Traffic Management and Synthesis of ATM Networks, Canada, 20-28.
  • Connolly, D. (1990), An Improved Annealing Scheme for the QAP, European Journal of Operational Research, 46, 93-100.
  • Dorigo, M., Maniezzo, V. and Colorni, A. (1996), The Ant System: Optimization by a Colony of Cooperating agents, IEEE Transactions on Systems, Man and Cybernetics, 26, 1-13.
  • Feo, T. and Resende, M. (1995), Greedy Randomized Adaptive Search Procedures, Journal of Global Optimization, 6, 109-133.
  • Gavish, B. (1985) Augmented Lagrangian based Algorithms for Centralised Network Design, IEEE Transactions on Communications, 33, 1247-1257.
  • Goldberg, D. (1989), Genetic Algorithms in Search, Optimization and Machine Learning, Addison- Wesley Publishing Company, Reading, Massachusetts.
  • Glover, F. and Laguna, M. (1997), Tabu Search, Kluwer Academic Publishers, Boston, Massachusetts.
  • Kershenbaum, A. and Peng, S. (1986), Neighbor Finding Algorithms for CMST Customer Calculations, IEEE INFOCOM, Miami, Florida.
  • Minoux, M. (1989), Network Synthesis and Optimum Network Design Problem: Models, Solution Methods and Applications, Networks, 19, 313-360.
  • Randall, M. (1999), A General Modelling System and Meta-heuristic based Solver for Combinatorial Optimisation Problems, PhD Thesis, Faculty of Environmental Sciences, Griffith University.
  • Randall, M. and Abramson, D. (1999), A General Meta-heuristic Based Solver for Combinatorial Optimisation Problems, Technical Report TR 99-01, School of Information Technology - Bond University, Submitted to the Journal of Computational Optimization and Applications.
  • Randall, M., McMahon, G. and Sugden, S. (1999), A Simulated Annealing Approach to Communication Network Design, Technical Report TR 99-04, School of Information Technology - Bond University, Submitted to the Journal of Combinatorial Optimization.
  • Sharma, U., Mistra, K. and Bhattacharji, A. (1991), Applications of an Efficient Search Technique for Optimal Design of Computer Communication Networks, Micro Electronics and Reliability, 31, 337-341.
  • van Laarhoven, P. and Aarts, E. (1987), Simulated Annealing: Theory and Applications, D. Reidel Publishing Company, Dordrecht.