The Map Generalization Capabilities of ArcGIS

The volume of data processed by Geographical Information Systems is enormous, and the information required from that data varies between applications. Specific details can be extracted: resolution can be reduced, contours thinned, data redundancy eliminated, or only the features relevant to the application at hand retained. All of this is aimed at reducing storage space and at accurately representing the detail of a larger-scale map on another of much smaller scale. This paper presents a framework for the map generalization tools embedded in ArcGIS (a Geographical Information Systems package by ESRI), together with the algorithm each tool uses. It closes with a review of all the tools, indicating which is most efficient after thorough analysis of the algorithms used and the output each produces.

1.0 Introduction

1.1 Definition of Map Generalization

As Goodchild (1991) points out, map generalization is the ability to simplify and show spatial relationships [features with locations attached to them] as seen on the earth's surface when modelled into a map. The advantages of adopting this process cannot be overemphasized. Some are itemized below (Lima d'Alge, 1998):

It reduces the complexity and the rigours that manual cartographic generalization goes through.

It conveys information accurately.

It preserves the spatial accuracy of features drawn from the earth's surface during modelling.

Many software vendors have come up with solutions to the problem of manual cartography, and this report reflects on the map generalization tools of ArcGIS 9.3.

1.2 Reasons for Automated Map Generalization

In times past, achieving this level of precision required the services of a skilled cartographer, who was faced with the task of modelling [representing features of the earth's surface] from a large-scale map onto a smaller-scale one. This form of manual cartography is very strenuous: it consumes a great deal of time and demands considerable expertise, since the cartographer must inevitably redraw all the features in smaller form while taking into consideration the level of precision required, so as not to render the data or its graphical representation invalid.

The setbacks experienced were the motivating factor for the advent of automatic cartographic design, known as automated map generalization. A crucial part of map generalization is information abstraction, not merely data compression. A good generalization technique should be intelligent, taking into consideration the characteristics of the image and not just its ideal geometric properties (Tinghua, 2004). Several algorithms [sets of instructions executed to achieve a programming result] have been developed to enable this, and this report explores each of them critically.

1.3 Process of Automated Map Generalization

As Brassel and Weibel (n.d.) describe, map generalization can be grouped into five steps:

Structure Recognition

Process Recognition

Process Modelling

Process Execution

Display

The step elaborated upon for the purposes of this report is "process recognition" [the types of generalization procedures], which involves different manipulations of geometry in order to simplify a shape and represent it at a smaller scale (Shea and McMaster, 1989).

2.0 Generalization Tools in ArcGIS 9.3

2.1 Smooth Polygon

This is a tool used for cartographic design in ArcGIS 9.3. It divides the polygon outline into several vertices, and each vertex is smoothed when the action is performed (FreePatentOnline, 2004-2010). An experiment is illustrated below to show how Smooth Polygon works.

Add the layer file "Polygon", whose attribute name is Huntingdonshire, a district selected from the England_dt_2001 area shapefile downloaded from UKBorders. Next, open the ArcToolbox from the standard toolbar of ArcMap, go to the Generalization toolset under Data Management Tools, and click Smooth Polygon. Open Smooth Polygon > select the input feature (the polygon to be smoothed), in this case "Polygon" > select the output feature class (the file location where the output image is to be saved) > select the smoothing algorithm (here PAEK) > select the smoothing tolerance.
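For those who prefer scripting the tool to running it from ArcToolbox, a minimal sketch follows. It is hedged in two ways: ArcGIS 9.3 itself exposed geoprocessing through the arcgisscripting module, so the ArcPy call shown here reflects later releases (where the tool lives in the Cartography toolbox), and the workspace and file names are illustrative, not those of the original experiment.

```python
import arcpy

arcpy.env.workspace = r"C:\data"  # assumed workspace

# Smooth Polygon with the same choices as the dialog run described above.
arcpy.cartography.SmoothPolygon(
    "Polygon.shp",           # input: the Huntingdonshire district polygon
    "Polygon_smoothed.shp",  # output feature class
    "PAEK",                  # smoothing algorithm; "BEZIER_INTERPOLATION" is the alternative
    "4 Kilometers")          # smoothing tolerance (ignored by Bezier interpolation)
```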

Fig 2.0: Display before Smooth Polygon Fig 2.1: Display after Smooth Polygon

Fig 2.1 shows the output when the Polynomial Approximation with Exponential Kernel (PAEK) algorithm (Bodansky et al., 2002) was used; the table below records the runs. The other algorithm that can be applied in this procedure is Bezier interpolation.

Algorithm Type          Simplification Tolerance (km)   Time Taken (secs)
PAEK                    4                               1
Bezier Interpolation    n/a (greyed out)                112

Observation

PAEK Algorithm: With this technique, as the simplification tolerance value is increased, the weight of each point in the image decreases and the image is smoothed more strongly. The output curves generated do not pass through the input line vertices; however, the endpoints are retained. A significant shortcoming of the PAEK algorithm is that, in smoothing rough edges, it can eliminate important boundaries; to prevent this, a buffer of a certain width should be applied to the zone before the PAEK smoothing algorithm is executed (Amelinckx, 2007).

Bezier Interpolation: This is the other algorithm that can be applied to achieve the smoothing technique on polygons. Its parameters are the same as PAEK's, except that the tolerance value is greyed out: no value is to be entered, and as a result the output image produced is close to its source, because the tolerance value is what controls the smoothing of rough edges (the higher the value, the more the polygon is smoothed). The output curves pass through the input line vertices. When this experiment was performed, it was noticed that the curves were properly aligned around the vertices.
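To make the vertex-preserving behaviour concrete, the sketch below evaluates one cubic Bezier segment in plain Python using the Bernstein form. It illustrates the curve family rather than ESRI's implementation: the tool fits a segment of this kind between each pair of consecutive input vertices and chooses the interior control points itself, which is why no tolerance value is needed.

```python
def cubic_bezier(p0, p1, p2, p3, steps=20):
    """Sample one cubic Bezier segment. The curve passes exactly through
    p0 and p3 (the input vertices); p1 and p2 only shape the arc between."""
    pts = []
    for k in range(steps + 1):
        t = k / steps
        u = 1.0 - t
        # Bernstein polynomial form of a cubic Bezier curve
        x = u**3 * p0[0] + 3 * u**2 * t * p1[0] + 3 * u * t**2 * p2[0] + t**3 * p3[0]
        y = u**3 * p0[1] + 3 * u**2 * t * p1[1] + 3 * u * t**2 * p2[1] + t**3 * p3[1]
        pts.append((x, y))
    return pts

# Usage: an arc from (0, 0) to (3, 0), bowed upward by its two control points
print(cubic_bezier((0, 0), (1, 2), (2, 2), (3, 0))[:3])
```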


Conclusion: After performing both experiments, it was observed that the PAEK algorithm is the better choice because it allows a tolerance value to be entered, which in turn gives a more strongly smoothed image around curves; this matters most to cartographers who want to smooth their image and remove redundant points.

2.2 Smooth Line

This is the second tool to be examined. It is similar to the Smooth Polygon technique, except that the input feature must be a polyline shapefile. The steps illustrated for Smooth Polygon are repeated, but under the Generalization toolset, Smooth Line is chosen. Under input feature, select gower1, a dataset provided for use in this report. Specify the output feature > select the smoothing algorithm (PAEK) > set the smoothing tolerance.

Note: All other fields are left at their defaults, i.e. NO_CHECK (of NO_CHECK/FLAG_ERROR, meaning we do not want any errors encountered to be displayed) and FIXED_ENDPOINT (of FIXED_ENDPOINT/NOT_FIXED, which preserves the endpoints of a closed line and applies only to the PAEK algorithm).
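The scripted equivalent of this run, again as a hedged ArcPy sketch (later-release syntax, assumed file names); the last two arguments mirror the endpoint and error-checking defaults from the note above:

```python
import arcpy

arcpy.cartography.SmoothLine(
    "gower1.shp",             # input polyline shapefile
    "gower1_smoothed.shp",    # output feature class
    "PAEK",                   # smoothing algorithm
    "1000 Kilometers",        # smoothing tolerance used in this experiment
    "FIXED_CLOSED_ENDPOINT",  # preserve endpoints of closed lines (default)
    "NO_CHECK")               # do not flag topological errors (default)
```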

Algorithm Type          Simplification Tolerance (km)   Time Taken (secs)
PAEK                    1000                            2
Bezier Interpolation    n/a (greyed out)                4

Fig 2.2: Display after Smooth Line technique was applied

__________ (Before Smoothing Line)

__________ (After Smoothing Line)

Observation

PAEK Algorithm: The tolerance value used here was very high so that the changes made could be seen clearly. The PAEK algorithm, as applied to gower1, smoothed the curves around the edges and eliminated unimportant points along them; this results in an image with fewer points as the tolerance value is increased. The output line does not pass through the input line vertices. The algorithm takes a weighted average of the points around each vertex and substitutes the vertex with those average coordinates. This is done sequentially for each vertex, but displacement of the shape is averted by giving greater weight to the central point than to its neighbouring vertices.
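A plain-Python sketch of that weighted-average idea follows. It is a deliberate simplification rather than ESRI's code: real PAEK parameterizes its exponential kernel by path length along the line, whereas this toy version weights by vertex index, but it reproduces the two behaviours noted above, i.e. fixed endpoints and a dominant central weight that prevents displacement.

```python
import math

def kernel_smooth(points, window):
    """Exponential-kernel moving average over a line's vertices.
    Endpoints are kept fixed, as the Smooth Line tool does."""
    smoothed = [points[0]]
    for i in range(1, len(points) - 1):
        wsum = xsum = ysum = 0.0
        for j, (x, y) in enumerate(points):
            # Weight decays exponentially with distance from vertex i,
            # so the central vertex dominates and the shape is not displaced.
            w = math.exp(-((j - i) ** 2) / (2.0 * window ** 2))
            wsum += w
            xsum += w * x
            ysum += w * y
        smoothed.append((xsum / wsum, ysum / wsum))
    smoothed.append(points[-1])
    return smoothed

# Usage: a noisy horizontal line; a larger window means stronger smoothing
line = [(0, 0), (1, 0.8), (2, -0.7), (3, 0.9), (4, -0.6), (5, 0)]
print(kernel_smooth(line, window=1.5))
```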

Bezier Interpolation: Just as in Smooth Polygon, a tolerance value is not required. When this technique was performed in this illustration, points around the edges were partially retained, resulting in smooth curves drawn around the vertices. The output line passes through the input line vertices.

Conclusion: From both illustrations, just as in Smooth Polygon, the PAEK algorithm was considered the more effective because it generates smoother curves around the edges as the tolerance value is increased. However, the true shape of the image can gradually be lost as this value grows, whereas with Bezier interpolation the curves around the vertices are preserved, merely smoothed, and the vertices themselves are maintained.

2.3 Simplify Polygon

This method is aimed at removing awkward bends around vertices while preserving the polygon's shape. Two algorithms are available: Point Remove and Bend Simplify.

The shapefile used for this illustration is the polygon of the Huntingdonshire district of England. Select Simplify Polygon (under the Generalization toolset, which is under Data Management Tools) > set the input feature (here, polygon) > the output feature > the simplification algorithm > the simplification tolerance.
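The equivalent ArcPy call, sketched under the same assumptions as before (later-release syntax, illustrative file names):

```python
import arcpy

arcpy.cartography.SimplifyPolygon(
    "Polygon.shp",             # the Huntingdonshire district polygon
    "Polygon_simplified.shp",  # output feature class
    "POINT_REMOVE",            # or "BEND_SIMPLIFY" for the second run
    "2 Kilometers")            # simplification tolerance used in the table below
```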

Algorithm Type          Simplification Tolerance (km)   Time Taken (secs)
Point Remove            2                               4
Bend Simplify           2                               9

Fig 2.3: Display before Simplify Polygon Fig 2.4: Display after Simplify Polygon

Point Remove Algorithm: This is a variant of the Douglas-Peucker algorithm, and it applies the area/perimeter quotient first used in the Wang algorithm (Wang, 1999, cited in ESRI, 2007). In the above experiment, as the tolerance value was increased, more vertices of the polygon were eliminated. The technique simplifies the polygon by removing a large number of vertices, and in doing so it gradually loses the original shape as the tolerance value is increased.
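Since Point Remove derives from Douglas-Peucker, a compact plain-Python rendering of the classic algorithm is given below (the textbook form, not ESRI's variant): the vertex farthest from the chord joining the endpoints is kept if its offset exceeds the tolerance, and the two halves are then simplified recursively.

```python
def point_line_distance(p, a, b):
    """Perpendicular distance from point p to the line through a and b."""
    (px, py), (ax, ay), (bx, by) = p, a, b
    dx, dy = bx - ax, by - ay
    if dx == 0 and dy == 0:
        return ((px - ax) ** 2 + (py - ay) ** 2) ** 0.5
    return abs(dy * px - dx * py + bx * ay - by * ax) / (dx * dx + dy * dy) ** 0.5

def douglas_peucker(points, tolerance):
    """Classic Douglas-Peucker line simplification."""
    if len(points) < 3:
        return points
    # Find the vertex farthest from the chord joining the endpoints.
    index, dmax = 0, 0.0
    for i in range(1, len(points) - 1):
        d = point_line_distance(points[i], points[0], points[-1])
        if d > dmax:
            index, dmax = i, d
    if dmax > tolerance:
        # Keep that vertex and simplify the two halves independently.
        left = douglas_peucker(points[:index + 1], tolerance)
        right = douglas_peucker(points[index:], tolerance)
        return left[:-1] + right  # drop the duplicated split vertex
    # All interior vertices lie within tolerance of the chord: remove them.
    return [points[0], points[-1]]

# Usage: the small wiggles vanish, the big corner at (3, 4) survives
line = [(0, 0), (1, 0.1), (2, -0.1), (3, 4), (4, 4.1), (5, 4)]
print(douglas_peucker(line, 0.5))
```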

Bend Simplify Algorithm: This algorithm was pioneered by Wang and Muller, and it is aimed at simplifying shapes by detecting bends along boundaries. It does this by eliminating insignificant bends, and the resultant output preserves the geometry better.

Observation: After applying both algorithms to the polygon above, it was seen that with Point Remove the number of vertices reduced dramatically as the tolerance value was increased in multiples of 2 km, amounting to about a 95% reduction, while the same approach applied to Bend Simplify gave about a 30% reduction in the number of vertices. Bend Simplify also took a longer time to execute.

Conclusion: Bend Simplify is seen to be the better option when geometry is to be preserved; however, when the shape is to be represented at a smaller scale, Point Remove is ideal, because the shape is reduced significantly and so appears as a shrunken image of the original.

2.4 Simplify Line

This is a similar procedure to Simplify Polygon, except that the shapefile considered here is a line (or a polygon containing intersecting lines). It is a process that reduces the number of vertices representing a line feature, preserving those that are most relevant and expunging those that are redundant, such as repeated curves or area partitions, without disrupting the original shape (Alves et al., 2010). Two layers are generated when this technique is performed: a line feature class and a point feature class. The former contains the simplified lines, while the latter contains features that have been simplified so far that they can no longer be shown as lines and are instead collapsed to points. This applies to Simplify Polygon too. For both exercises here, however, no vertex was collapsed to a point feature.


To illustrate this, the process from the previous generalization techniques is repeated: under Data Management Tools > select Simplify Line > select the input feature (gower1) > select the output feature > select the algorithm (Point Remove) > set the tolerance. All other defaults are then accepted, because we are not interested in the errors.
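In scripted form (same hedged ArcPy assumptions; the companion point feature class described above is created automatically alongside the output):

```python
import arcpy

arcpy.cartography.SimplifyLine(
    "gower1.shp",             # input polyline shapefile
    "gower1_simplified.shp",  # output; collapsed vertices go to a companion point class
    "POINT_REMOVE",           # or "BEND_SIMPLIFY" for the second run
    "8 Kilometers")           # simplification tolerance used in the table below
```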

Algorithm Type          Simplification Tolerance (km)   Time Taken (secs)
Point Remove            8                               7
Bend Simplify           8                               12

Fig 2.5: Display after Simplify Line

__________ (Before Simplifying Line)

__________ (After Simplifying Line)

Two algorithms are available for performing this operation: Point Remove and Bend Simplify.

Observation

Point Remove Algorithm: This method has already been described under Simplify Polygon. It was observed here that when the Point Remove algorithm was used, the lines in gower1 were redrawn so that redundantly occurring vertices were removed; this became even more evident as the tolerance value increased, with the line developing sharp angles around curves and its initial geometry being gradually lost.

Bend Simplify Algorithm: This also reduces the number of vertices in a line, and the more the tolerance value was increased, the greater the reduction in vertices. It takes a longer time to execute than Point Remove; however, the original character of the line feature is preserved.

Conclusion: From the two practical exercises, the Bend Simplify algorithm is the more accurate because it preserves the line feature and does not distort its original shape too much. However, if the feature is to be represented at a much smaller scale and data compression is the factor considered, then Point Remove is the option to embrace.

2.5 Aggregate Polygon: This process involves amalgamating polygons with neighbouring boundaries. It merges separate polygons (both distinct and adjacent ones), and a new perimeter area is obtained that encompasses the surface area of all the polygons that were merged together.

To illustrate this, select Data Management Tools > select Aggregate Polygons > select the input feature (a selection of several districts from the downloaded England_dt_2001 area shapefile) > the output feature class > the aggregation distance (the boundary distance between polygons), leaving the other values at their defaults.
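A hedged ArcPy sketch of the same run, with an assumed file name for the district selection:

```python
import arcpy

arcpy.cartography.AggregatePolygons(
    "districts.shp",      # selected England_dt_2001 districts (assumed name)
    "districts_agg.shp",  # output feature class
    "2 Kilometers")       # aggregation distance used in this exercise
```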

Fig 2.6: Display before Aggregate Polygon Fig 2.7: Display after Aggregate Polygon

Aggregation Distance Used – 2 km

Time Taken – 48 secs

As seen from both figures, the districts in Fig 2.6 were joined together as shown in Fig 2.7. As the aggregation distance is increased further, the separate districts are over-merged and the resultant image comes to resemble a plain wide surface, until the hollow parts seen in Fig 2.7 disappear. The algorithm used here, which is built into the ArcGIS software, is the Sort-Tile-Recursive (STR) tree. This algorithm computes all the nodes of neighbouring polygons by traversing them in a logical sequence from left to right. When this computation is complete, the result is stored as a referenced node. The middle node of the tree is then obtained, and a merge is calculated that spans from the left node to the right node until it reaches the root of the tree (Xie, 2010).

2.6 Simplify Building: This process simplifies building-shaped polygons with the aim of preserving their original structure. To illustrate it, Simplify Building is chosen under Data Management Tools and the appropriate fields are filled in; the input feature here is a building shapefile extracted from a MasterMap download for the postcode "CF37 1TW".
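In scripted form, again as a hedged ArcPy sketch with an assumed name for the extracted MasterMap buildings:

```python
import arcpy

arcpy.cartography.SimplifyBuilding(
    "buildings_CF37.shp",        # MasterMap building footprints (assumed name)
    "buildings_simplified.shp",  # output feature class
    "10 Kilometers")             # simplification tolerance used in this run
```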

a b c d

Fig 2.8: Display before Simplify Building Fig 2.9: Display after Simplify Building

As shown above, the buildings at (a) and (b) in Fig 2.8 were simplified to (c) and (d) in Fig 2.9, where a tolerance value of 10 km was used; the time taken to execute this task was 3 secs. The more the tolerance value is increased, the more simplified the building becomes and the more it loses its shape. The algorithm behind the scenes is a recursive approach that was first implemented in the C++ programming language but has since been packaged as DLL (Dynamic Link Library) components used by applications such as ArcGIS 9.3.

The recursive approach algorithm follows this sequence of steps.

Determine the angle of rotation α of the building by computing the nodes around its boundary and then enclosing the set of points in a small rectangular area.

The angle of rotation α is set

Determine the vertices around the edges with respect to the recursion used, then calculate the splitting rate µ and perform a recursive decomposition of the edge with respect to the new edges.


The shortcoming of this algorithm is that 'L'- and 'Z'-shaped buildings are culprits, as they give erroneous shapes, while it works well on 'U'-shaped buildings (Bayer, 2009).

2.7 Eliminate: This technique works on an input layer with a selection, made either through 'Select by Location' or through a 'Select by Attribute' query. The tool then merges the selected polygons away into their neighbours, and the remaining composites of the layer file are drawn out. To illustrate this, Eliminate is chosen under Data Management Tools; the input feature here is the England_dt_2001 area shapefile with some districts selected, the output feature is specified, and all other fields are left at their defaults.
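Because Eliminate acts only on the selected features of a layer, a scripted run must build a feature layer and make a selection first. The sketch below uses assumed field and district names:

```python
import arcpy

# Eliminate operates on a layer's current selection, so build one first.
arcpy.management.MakeFeatureLayer("England_dt_2001.shp", "districts_lyr")
arcpy.management.SelectLayerByAttribute(
    "districts_lyr", "NEW_SELECTION",
    "\"NAME\" IN ('Huntingdonshire', 'Cambridge')")  # assumed field and values

# Selected polygons are merged into the neighbour with the longest
# shared border (the tool's default behaviour).
arcpy.management.Eliminate("districts_lyr", "districts_eliminated.shp")
```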

After the Eliminate procedure was applied to the polygon in Fig 3.0 (the green highlights being the selected features), the resultant polygon is shown in Fig 3.1. The districts in Fig 3.1 now exclude all those selected in Fig 3.0, as can be seen visually at labels 'a' and 'b'; Fig 3.1 therefore has fewer districts.

a b

Fig 3.0: Display before Eliminate process Fig 3.1: Display after Eliminate process

The time taken for this procedure was 44 secs.

2.8 Dissolve: The Dissolve tool works similarly to Aggregate Polygon, except that in Dissolve it is features sharing attribute values that are aggregated, not separate polygons merged by distance. The features are merged together, with various statistics types available for summarizing their attributes, rather like an alias performed on them.

To illustrate this, click on Dissolve under Data Management Tools, select the input features (the same ones used for Aggregate Polygons) > the output field (where the result is to be saved) > the dissolve field (the fields to aggregate on) > the statistics type > multi_part > dissolve_lines. The diagrams below show this.

Observation: For this exercise, the dissolve field was left at its default, meaning no field was selected. Multi_part was also used, which permits multipart features in the output; merging many smaller features into one very extensive feature can cause a loss of display performance on a map, and choosing single-part output instead ensures larger merged features are split into separate smaller ones. The dissolve_lines field ensures lines are dissolved into one feature, while unsplit_lines dissolves two lines only when they share an end node. The decision logic of this technique is simply Boolean (a true-or-false, yes-or-no situation). There are shortcomings, however: low virtual memory on the computer can limit the features that can be dissolved, although input features can be dissected into parts by an algorithm called adaptive tiling.
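The same run in scripted form (hedged ArcPy sketch; the empty strings mirror the defaults left in the dialog):

```python
import arcpy

arcpy.management.Dissolve(
    "districts.shp",            # same input as Aggregate Polygons (assumed name)
    "districts_dissolved.shp",  # output feature class
    "",                         # dissolve field left empty, as in this exercise
    "",                         # no statistics fields
    "MULTI_PART",               # allow multipart output features
    "DISSOLVE_LINES")           # dissolve lines into a single feature
```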

Fig 3.2: Display before Dissolve process Fig 3.3: Display after Dissolve process

Time taken = 10 secs

2.9 Collapse Dual Lines: This is useful when centre lines are to be generated between two or more parallel lines of a specified width. It can be very useful when large road networks within a block or casing have to be considered, as it enables them to be visualized properly. To illustrate this, open Collapse Dual Lines under Data Management Tools > select the input feature (gower1) > select the output feature > set the maximum width.

The maximum width is the widest casing allowed to contain the feature to be collapsed (e.g. the width of a road network), while the minimum width is the smallest value from which a centre line can be derived.
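In scripted form (hedged ArcPy sketch; the optional minimum width is simply omitted, as in the dialog):

```python
import arcpy

arcpy.cartography.CollapseDualLinesToCenterline(
    "gower1.shp",              # input polyline shapefile
    "gower1_centerlines.shp",  # output centre-line feature class
    "4 Kilometers")            # maximum width of the casing to collapse
```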

In this exercise, maximum width = 4 km

Time taken = 4 secs

Fig 3.4: Display after Collapse Dual Line to Centerline

__________ (Before Collapse Dual Line)

__________ (After Collapse Dual Line)

As seen above, when this experiment was performed, the lines shown in blue are the results of the operation: they were red before it ran. The lines still in red did not change, because their width was not within the specified maximum width. This changes as the maximum width is increased or a minimum width is set.

3.0 Conclusion

From the illustrations shown in this paper, we can see that the various generalization tools serve various purposes, whether shape retention, angular preservation, or simple reduction, so that a replica of an image shown at a larger scale can fit properly at a smaller scale. Depending on the tool chosen, however, a compromise will have to be made among these factors, giving preference to whatever we most want represented after performing the operation. Different algorithms were explored, and it is inferred that when polygons or lines are to be simplified, Point Remove is the accurate option for representing them at a smaller scale, whereas if originality of shape is the consideration, the Bend Simplify algorithm works best; for the smoothing technique on polygons and lines, the PAEK algorithm is the better choice.
