The Third Agent: In Danza

At this point, the cheapest route you know of that goes from Ajo to any town without an agent is the direct route from Ajo to Danza; both the Ajo-Bordo-Colina route and the Ajo-Bordo-Danza route are more expensive. You hire another passerby and send her to Danza with a ticket for that fare. She reports the fares from Danza to Colina and from Danza to Erizo.

Now you can modify your entry for Colina. Before, the cheapest known way to Colina was from Ajo going via Bordo; now you see you can reach Colina more cheaply going via Danza. Also, you now know the fare from Ajo to the previously unknown Erizo: it's the route via Danza. You note these changes, as shown in the table and figure below.

[Figure: following this step in the shortest-path algorithm]

[Table (step 3): agents at Ajo, Bordo, and Danza - the cheapest known fares from Ajo to Bordo, Colina, Danza, and Erizo at each step; in step 3, Bordo and Danza are reached via Ajo, Colina is now reached via Danza, and Erizo is reached via Danza]

The Fourth Agent: In Colina

Now the cheapest path to any town without an agent is the trip from Ajo to Colina, going via Danza. Accordingly, you dispatch an agent over this route to Colina. He reports the fare from there to Erizo. Now you can calculate that, since you know the fare from Ajo to Colina (via Danza) and the fare from Colina to Erizo, you can reduce the minimum Ajo-to-Erizo fare: the Ajo-Danza-Colina-Erizo route is cheaper than the Ajo-Danza-Erizo route. You update your notebook accordingly, as shown in the table and figure below.

[Figure: following this step in the shortest-path algorithm]

[Table (step 4): agents in Ajo, Bordo, Danza, and Colina - the cheapest known fares from Ajo to Bordo, Colina, Danza, and Erizo; in step 4, Colina is reached via Danza and Erizo is now reached via Colina]

The Last Agent: In Erizo

The cheapest path from Ajo to any town you know about that doesn't have an agent is now the path to Erizo, via Danza and Colina. You dispatch an agent to Erizo, but she reports that there are no routes from Erizo to towns without agents. (There's a route to Bordo, but Bordo has an agent.) The table below shows the final line in your notebook; all you've done is add a star to the Erizo entry to show that an agent is there.

[Table (step 5): agents in Ajo, Bordo, Danza, Colina, and Erizo - the final cheapest fares from Ajo: Bordo via Ajo, Colina via Danza, Danza via Ajo, and Erizo via Colina]
When there's an agent in every town, you know the fares from Ajo to every other town, so you're done. With no further calculations, the last line in your notebook shows the cheapest routes from Ajo to all other towns.

This narrative has demonstrated the essentials of Dijkstra's algorithm. The key points are:

- Each time you send an agent to a new town, you use the new information provided by that agent to revise your list of fares. Only the cheapest fare (that you know about) from the starting point to a given town is retained.
- You always send the new agent to the town that has the cheapest path from the starting point (not the cheapest edge from any town with an agent, as in the minimum spanning tree).

Using the GraphDW Workshop Applet

Let's see how Dijkstra's algorithm looks using the GraphDW (for directed and weighted) Workshop applet. Use the applet to create the graph from the train-fare figure. The result should look something like the figure below. (We'll see how to make the table appear below the graph in a moment.) This is a weighted, directed graph, so to make an edge, you must type a number before dragging, and you must drag in the correct direction: from the start to the destination.

[Figure: the railroad scenario in GraphDW]

When the graph is complete, click the Path button, and when prompted, click the A vertex. A few more clicks on Path will place A in the tree, shown with a red circle around A.
The Shortest-Path Array

An additional click of the Path button will install a table under the graph, as you can see in the figure. The corresponding message is "Copied row A from adjacency matrix to shortest-path array."

Dijkstra's algorithm starts by copying the appropriate row of the adjacency matrix (that is, the row for the starting vertex) to an array. (Remember that you can examine the adjacency matrix at any time by pressing the View button.) This array is called the "shortest-path" array. It corresponds to the most recent row of notebook entries you made while determining the cheapest train fares in Magnaguena. This array will hold the current versions of the shortest paths to the other vertices, which we can call the destination vertices. These destination vertices are represented by the column heads in the table.

[Table (step 1): the shortest-path array - the B and D columns show their one-edge distances, each with parent A; the A, C, and E columns show inf with parent A]

In the applet, the shortest-path figures in the array are followed by the parent vertex enclosed in parentheses. The parent is the vertex you reached just before you reached the destination vertex. In this case the parents are all A because we've moved only one edge away from A. If a fare is unknown (or meaningless, as from A to A), it's shown as infinity, represented by "inf," as in the rail-fare notebook entries. Notice that the column heads of those vertices that have already been added to the tree are shown in red; the entries for these columns won't change.

Minimum Distance

Initially, the algorithm knows only the distances from A to vertices that are exactly one edge from A. Only B and D are adjacent to A, so they're the only ones whose distances are shown. The algorithm picks the minimum distance. Another click on Path will show you the message reporting that the minimum distance from A is to vertex B. The algorithm adds this vertex to the tree, so the next click will show you "Added vertex B to tree." Now B is circled in the graph, and the B column head is in red. The edge from A to B is made darker to show it's also part of the tree.
Column by Column in the Shortest-Path Array

Now the algorithm knows not only all the edges from A, but the edges from B as well. So it goes through the shortest-path array, column by column, checking whether a shorter path than that shown can be calculated using this new information. Vertices that are already in the tree, here A and B, are skipped.

First, column C is examined. You'll see a message of the form "To C: A to B plus edge BC less than A to C (inf)." The algorithm has found a shorter path to C than that shown in the array. The array shows infinity in the C column, but the distance from A to B is known (the algorithm finds it in the B column of the shortest-path array), and the distance from B to C is known (it finds it in row B, column C of the adjacency matrix). Their sum is less than infinity, so the algorithm updates the shortest-path array for column C, inserting the new distance. This is followed by B in parentheses, because that's the last vertex before reaching C; B is the parent of C.

Next, the D column is examined. You'll see a message of the form "To D: A to B plus edge BD greater than or equal to A to D." The algorithm is comparing the previously shown distance from A to D, which is the direct route, with a possible route via B (that is, A-B-D). But the path to B plus the edge BD sum to more than the direct route, so the entry for column D is not changed.

For column E, the message is of the form "To E: A to B plus edge BE (inf) greater than or equal to A to E (inf)." The newly calculated route from A to E via B (the A-to-B distance plus infinity) is still greater than or equal to the current entry in the array (infinity), so the E column is not changed. The shortest-path array now looks like the table below.

[Table (step 2): the shortest-path array - A is inf(A); B and D keep their distances with parent A; C now has a finite distance with parent B; E remains inf(A)]

Now we can see more clearly the role played by the parent vertex shown in parentheses after each distance. Each column shows the distance from A to an ending vertex. The parent is the immediate predecessor of the ending vertex along the path from A. In column C, the parent vertex is B, meaning that the shortest path from A to C passes through B just before it gets to C. This information is used by the algorithm to place the appropriate edge in the tree. (When the distance is infinity, the parent vertex is meaningless and is shown as A.)
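The comparison the applet is describing can be written as a single conditional. Here is a minimal sketch, with hypothetical array names distTo[] and parent[] (the book's own version of this step appears later in this chapter as adjust_sPath()):

   // is the path to vertex c, going via vertex b, cheaper than the
   // best path to c known so far? b and c are vertex indices.
   static void relax(int b, int c, int[][] adjMat, int[] distTo, int[] parent)
      {
      if(distTo[b] + adjMat[b][c] < distTo[c])
         {
         distTo[c] = distTo[b] + adjMat[b][c];  // record the shorter distance
         parent[c] = b;                         // b becomes c's parent
         }
      }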
New Minimum Distance

Now that the shortest-path array has been updated, the algorithm finds the shortest distance in the array, as you will see with another Path keypress. The message reports that the minimum distance from A is to vertex D. Accordingly, the message "Added vertex D to tree" appears, and the new vertex and the edge AD are added to the tree.

Do It Again and Again

Now the algorithm goes through the shortest-path array again, checking and updating the distances for destination vertices not in the tree; only C and E are still in this category. Columns C and E are both updated. The result is shown in the table below.

[Table (step 3): the shortest-path array - A is inf(A); B and D keep their entries with parent A; C is now reached via D; E is now reached via D]

The shortest path from A to a non-tree vertex is now the one to vertex C, so C is added to the tree. The next time through the shortest-path array, only the distance to E is considered. It can be shortened by going via C, so we have the entries shown in the table below.

[Table (step 4): the shortest-path array - A is inf(A); B, C, and D keep their entries; E is now reached via C]

Now the last vertex, E, is added to the tree, and you're done. The shortest-path array shows the shortest distances from A to all the other vertices. The tree consists of all the vertices and the edges AB, AD, DC, and CE, shown with thick lines.

You can work backward to reconstruct the sequence of vertices along the shortest path to any vertex. For the shortest path to E, for example, the parent of E, shown in the array in parentheses, is C; the predecessor of C, again from the array, is D; and the predecessor of D is A. So the shortest path from A to E follows the route A-D-C-E.

Experiment with other graphs using GraphDW, starting with small ones. You'll find that after a while you can predict what the algorithm is going to do, and you'll be on your way to understanding Dijkstra's algorithm.
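Working backward through the parents is easy to mechanize. Here is a minimal sketch (not code from the book) that walks a hypothetical parent[] array, indexed like the shortest-path array, together with a labels[] array of vertex names:

   // Reconstruct the route start -> ... -> dest by following parent links,
   // assuming parent[v] holds the predecessor of v on the cheapest known path.
   static String route(int[] parent, char[] labels, int start, int dest)
      {
      StringBuilder path = new StringBuilder();
      int v = dest;
      while(v != start)                // walk backward until we reach the start
         {
         path.insert(0, labels[v]);
         v = parent[v];
         }
      path.insert(0, labels[start]);
      return path.toString();          // e.g. "ADCE" for the route A-D-C-E
      }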
Java Code

The code for the shortest-path algorithm is among the most complex in this book, but even so it's not beyond mere mortals. We'll look first at a helper class, then at the chief method that executes the algorithm, path(), and finally at two methods called by path() to carry out specialized tasks.

The sPath Array and the DistPar Class

As we've seen, the key data structure in the shortest-path algorithm is an array that keeps track of the minimum distances from the starting vertex to the other vertices (the destination vertices). During the execution of the algorithm, these distances are changed, until at the end they hold the actual shortest distances from the start. In the example code, this array is called sPath[] (for shortest paths).

As we've seen, it's important to record not only the minimum distance from the starting vertex to each destination vertex, but also the path taken. Fortunately, the entire path need not be explicitly stored. It's only necessary to store the parent of the destination vertex. The parent is the vertex reached just before the destination. We've seen this in the Workshop applet, where a distance followed by (D) in the C column means that this is the cheapest path from A to C, and D is the last vertex before C on this path.

There are several ways to keep track of the parent vertex, but we choose to combine the parent with the distance and put the resulting object into the sPath[] array. We call this class of objects DistPar (for distance-parent).

   class DistPar              // distance and parent
      {                       // items stored in sPath array
      public int distance;    // distance from start to this vertex
      public int parentVert;  // current parent of this vertex

      public DistPar(int pv, int d)   // constructor
         {
         distance = d;
         parentVert = pv;
         }
      }  // end class DistPar

The path() Method

The path() method carries out the actual shortest-path algorithm. It uses the DistPar class and the Vertex class, which we saw in the mstw.java program earlier. The path() method is a member of the Graph class, which we also saw in mstw.java in a somewhat different version.
   public void path()                // find all shortest paths
      {
      int startTree = 0;             // start at vertex 0
      vertexList[startTree].isInTree = true;
      nTree = 1;                     // put it in tree

      // transfer row of distances from adjMat to sPath
      for(int j=0; j<nVerts; j++)
         {
         int tempDist = adjMat[startTree][j];
         sPath[j] = new DistPar(startTree, tempDist);
         }

      // until all vertices are in the tree
      while(nTree < nVerts)
         {
         int indexMin = getMin();    // get minimum from sPath
         int minDist = sPath[indexMin].distance;

         if(minDist == INFINITY)     // if all infinite
            {                        // or in tree,
            System.out.println("There are unreachable vertices");
            break;                   // sPath is complete
            }
         else
            {                        // reset currentVert
            currentVert = indexMin;  // to closest vert
            startToCurrent = sPath[indexMin].distance;
            // minimum distance from startTree is
            // to currentVert, and is startToCurrent
            }
         // put current vertex in tree
         vertexList[currentVert].isInTree = true;
         nTree++;
         adjust_sPath();             // update sPath[] array
         }  // end while(nTree<nVerts)

      displayPaths();                // display sPath[] contents

      nTree = 0;                     // clear tree
      for(int j=0; j<nVerts; j++)
         vertexList[j].isInTree = false;
      }  // end path()
The starting vertex is always at index 0 of the vertexList[] array. The first task in path() is to put this vertex into the tree. As the algorithm proceeds, we'll be moving other vertices into the tree as well. The Vertex class contains a flag that indicates whether a vertex object is in the tree. Putting a vertex in the tree consists of setting this flag and incrementing nTree, which counts how many vertices are in the tree.

Second, path() copies the distances from the appropriate row of the adjacency matrix to sPath[]. This is always row 0, because for simplicity we assume 0 is the index of the starting vertex. Initially, the parent field of all the sPath[] entries is A, the starting vertex.

We now enter the main while loop of the algorithm. This loop terminates after all the vertices have been placed in the tree. There are basically three actions in this loop:

1. Choose the sPath[] entry with the minimum distance.
2. Put the corresponding vertex (the column head for this entry) in the tree. This becomes the "current vertex," currentVert.
3. Update all the sPath[] entries to reflect distances from currentVert.

If path() finds that the minimum distance is infinity, it knows that some vertices are unreachable from the starting point. Why? Because not all the vertices are in the tree (the while loop hasn't terminated), and yet there's no way to get to these extra vertices; if there were, there would be a non-infinite distance.

Before returning, path() displays the final contents of sPath[] by calling the displayPaths() method. This is the only output from the program. Also, path() sets nTree to 0 and removes the isInTree flags from all the vertices, in case they might be used again by another algorithm (although they aren't in this program).

Finding the Minimum Distance with getMin()

To find the sPath[] entry with the minimum distance, path() calls the getMin() method. This routine is straightforward: it steps across the sPath[] entries and returns with the column number (the array index) of the entry with the minimum distance.

   public int getMin()               // get entry from sPath
      {                              //    with minimum distance
      int minDist = INFINITY;        // assume large minimum
      int indexMin = 0;
      for(int j=1; j<nVerts; j++)    // for each vertex,
         {                           // if it's in tree and
         if( !vertexList[j].isInTree &&        // smaller than old one
                            sPath[j].distance < minDist )
            {
            minDist = sPath[j].distance;
            indexMin = j;            // update minimum
            }
         }  // end for
      return indexMin;               // return index of minimum
      }  // end getMin()

We could have used a priority queue as the basis for the shortest-path algorithm, as we did in the previous section to find the minimum spanning tree. If we had, the getMin() method would not have been necessary; the minimum-weight edge would have appeared automatically at the front of the queue. However, the array approach shown makes it easier to see what's going on.

Updating sPath[] with adjust_sPath()

The adjust_sPath() method is used to update the sPath[] entries to reflect new information obtained from the vertex just inserted in the tree. When this routine is called, currentVert has just been placed in the tree, and startToCurrent is the current entry in sPath[] for this vertex.

The adjust_sPath() method now examines each vertex entry in sPath[], using the loop counter column to point to each vertex in turn. For each sPath[] entry, provided the vertex is not in the tree, it does three things:

1. It adds the distance to the current vertex (already calculated and now in startToCurrent) to the edge distance from currentVert to the column vertex. We call the result startToFringe.
2. It compares startToFringe with the current entry in sPath[].
3. If startToFringe is less, it replaces the entry in sPath[].

This is the heart of Dijkstra's algorithm. It keeps sPath[] updated with the shortest distances to all the vertices that are currently known. Here's the code for adjust_sPath():
   public void adjust_sPath()
      {
      // adjust values in shortest-path array sPath
      int column = 1;                // skip starting vertex
      while(column < nVerts)         // go across columns
         {
         // if this column's vertex already in tree, skip it
         if( vertexList[column].isInTree )
            {
            column++;
            continue;
            }
         // calculate distance for one sPath entry
                        // get edge from currentVert to column
         int currentToFringe = adjMat[currentVert][column];
                        // add distance from start
         int startToFringe = startToCurrent + currentToFringe;
                        // get distance of current sPath entry
         int sPathDist = sPath[column].distance;

         // compare distance from start with sPath entry
         if(startToFringe < sPathDist)   // if shorter,
            {                            // update sPath
            sPath[column].parentVert = currentVert;
            sPath[column].distance = startToFringe;
            }
         column++;
         }  // end while(column < nVerts)
      }  // end adjust_sPath()

The main() routine in the path.java program creates the tree of the figure and displays its shortest-path array. Here's the code:

   public static void main(String[] args)
      {
      Graph theGraph = new Graph();
      theGraph.addVertex('A');     // 0  (start)
      theGraph.addVertex('B');     // 1
      theGraph.addVertex('C');     // 2
      theGraph.addVertex('D');     // 3
      theGraph.addVertex('E');     // 4
                                   // (each edge weight is the fare shown in the figure)
      theGraph.addEdge(0, 1, …);   // AB
      theGraph.addEdge(0, 3, …);   // AD
      theGraph.addEdge(1, 2, …);   // BC
      theGraph.addEdge(1, 3, …);   // BD
      theGraph.addEdge(2, 4, …);   // CE
      theGraph.addEdge(3, 2, …);   // DC
      theGraph.addEdge(3, 4, …);   // DE
      theGraph.addEdge(4, 1, …);   // EB

      System.out.println("Shortest paths");
      theGraph.path();             // shortest paths
      System.out.println();
      }  // end main()
The output of this program is

   A=inf(A) B=…(A) C=…(D) D=…(A) E=…(C)

where each entry is the shortest distance from A followed, in parentheses, by the parent vertex on that path.

The path.java Program

The listing below shows the complete code for the path.java program. Its various components were all discussed earlier.

LISTING: The path.java Program

   // path.java
   // demonstrates shortest path with weighted, directed graphs
   // to run this program: C>java PathApp
   ////////////////////////////////////////////////////////////////
   class DistPar              // distance and parent
      {                       // items stored in sPath array
      public int distance;    // distance from start to this vertex
      public int parentVert;  // current parent of this vertex
   // -------------------------------------------------------------
      public DistPar(int pv, int d)   // constructor
         {
         distance = d;
         parentVert = pv;
         }
   // -------------------------------------------------------------
      }  // end class DistPar
   ////////////////////////////////////////////////////////////////
   class Vertex
      {
      public char label;        // label (e.g. 'A')
      public boolean isInTree;
   // -------------------------------------------------------------
      public Vertex(char lab)   // constructor
         {
         label = lab;
         isInTree = false;
         }
   // -------------------------------------------------------------
      }  // end class Vertex
   ////////////////////////////////////////////////////////////////
   class Graph
      {
      private final int MAX_VERTS = 20;
      private final int INFINITY = 1000000;
      private Vertex vertexList[];     // list of vertices
      private int adjMat[][];          // adjacency matrix
      private int nVerts;              // current number of vertices
      private int nTree;               // number of verts in tree
      private DistPar sPath[];         // array for shortest-path data
      private int currentVert;         // current vertex
      private int startToCurrent;      // distance to currentVert
   // -------------------------------------------------------------
      public Graph()                   // constructor
         {
         vertexList = new Vertex[MAX_VERTS];
                                       // adjacency matrix
         adjMat = new int[MAX_VERTS][MAX_VERTS];
         nVerts = 0;
         nTree = 0;
         for(int j=0; j<MAX_VERTS; j++)      // set adjacency
            for(int k=0; k<MAX_VERTS; k++)   //    matrix
               adjMat[j][k] = INFINITY;      //    to infinity
         sPath = new DistPar[MAX_VERTS];     // shortest paths
         }  // end constructor
   // -------------------------------------------------------------
      public void addVertex(char lab)
         {
         vertexList[nVerts++] = new Vertex(lab);
         }
   // -------------------------------------------------------------
      public void addEdge(int start, int end, int weight)
         {
         adjMat[start][end] = weight;  // (directed)
         }
   // -------------------------------------------------------------
      public void path()               // find all shortest paths
         {
         int startTree = 0;            // start at vertex 0
         vertexList[startTree].isInTree = true;
         nTree = 1;                    // put it in tree

         // transfer row of distances from adjMat to sPath
         for(int j=0; j<nVerts; j++)
            {
            int tempDist = adjMat[startTree][j];
            sPath[j] = new DistPar(startTree, tempDist);
            }

         // until all vertices are in the tree
         while(nTree < nVerts)
            {
            int indexMin = getMin();   // get minimum from sPath
            int minDist = sPath[indexMin].distance;

            if(minDist == INFINITY)    // if all infinite
               {                       // or in tree,
               System.out.println("There are unreachable vertices");
               break;                  // sPath is complete
               }
            else
               {                       // reset currentVert
               currentVert = indexMin; // to closest vert
               startToCurrent = sPath[indexMin].distance;
               // minimum distance from startTree is
               // to currentVert, and is startToCurrent
               }
            // put current vertex in tree
            vertexList[currentVert].isInTree = true;
            nTree++;
            adjust_sPath();            // update sPath[] array
            }  // end while(nTree<nVerts)

         displayPaths();               // display sPath[] contents

         nTree = 0;                    // clear tree
         for(int j=0; j<nVerts; j++)
            vertexList[j].isInTree = false;
         }  // end path()
   // -------------------------------------------------------------
      public int getMin()              // get entry from sPath
         {                             //    with minimum distance
         int minDist = INFINITY;       // assume minimum
         int indexMin = 0;
         for(int j=1; j<nVerts; j++)   // for each vertex,
            {                          // if it's in tree and
            if( !vertexList[j].isInTree &&     // smaller than old one
                               sPath[j].distance < minDist )
               {
               minDist = sPath[j].distance;
               indexMin = j;           // update minimum
               }
            }  // end for
         return indexMin;              // return index of minimum
         }  // end getMin()
   // -------------------------------------------------------------
      public void adjust_sPath()
         {
         // adjust values in shortest-path array sPath
         int column = 1;               // skip starting vertex
         while(column < nVerts)        // go across columns
            {
            // if this column's vertex already in tree, skip it
            if( vertexList[column].isInTree )
               {
               column++;
               continue;
               }
            // calculate distance for one sPath entry
                        // get edge from currentVert to column
            int currentToFringe = adjMat[currentVert][column];
                        // add distance from start
            int startToFringe = startToCurrent + currentToFringe;
                        // get distance of current sPath entry
            int sPathDist = sPath[column].distance;

            // compare distance from start with sPath entry
            if(startToFringe < sPathDist)   // if shorter,
               {                            // update sPath
               sPath[column].parentVert = currentVert;
               sPath[column].distance = startToFringe;
               }
            column++;
            }  // end while(column < nVerts)
         }  // end adjust_sPath()
   // -------------------------------------------------------------
      public void displayPaths()
         {
         for(int j=0; j<nVerts; j++)   // display contents of sPath[]
            {
            System.out.print(vertexList[j].label + "=");  // B=
            if(sPath[j].distance == INFINITY)
               System.out.print("inf");                   // inf
            else
               System.out.print(sPath[j].distance);       // the distance
            char parent = vertexList[ sPath[j].parentVert ].label;
            System.out.print("(" + parent + ") ");        // (A)
            }
         System.out.println("");
         }
   // -------------------------------------------------------------
      }  // end class Graph
   ////////////////////////////////////////////////////////////////
   class PathApp
      {
      public static void main(String[] args)
         {
         Graph theGraph = new Graph();
         theGraph.addVertex('A');     // 0  (start)
         theGraph.addVertex('B');     // 1
         theGraph.addVertex('C');     // 2
         theGraph.addVertex('D');     // 3
         theGraph.addVertex('E');     // 4
                                      // (each edge weight is the fare shown in the figure)
         theGraph.addEdge(0, 1, …);   // AB
         theGraph.addEdge(0, 3, …);   // AD
         theGraph.addEdge(1, 2, …);   // BC
         theGraph.addEdge(1, 3, …);   // BD
         theGraph.addEdge(2, 4, …);   // CE
         theGraph.addEdge(3, 2, …);   // DC
         theGraph.addEdge(3, 4, …);   // DE
         theGraph.addEdge(4, 1, …);   // EB

         System.out.println("Shortest paths");
         theGraph.path();             // shortest paths
         System.out.println();
         }  // end main()
      }  // end class PathApp
   ////////////////////////////////////////////////////////////////
The All-Pairs Shortest-Path Problem

In discussing connectivity in the preceding chapter, we wanted to know whether it was possible to fly from Athens to Murmansk, even if we didn't care how many stops we made. With weighted graphs, we can answer the second question that might occur to you as you wait at the Hubris Airlines ticket counter: how much will the journey cost?

To find whether a trip was possible, we created a connectivity table. With weighted graphs we want a table that gives the minimum cost from any vertex to any other vertex, using multiple edges. This is called the all-pairs shortest-path problem.

You can create such a table by running the path.java program using each vertex in turn as the starting vertex. This will yield something like the table below.

[Table: all-pairs shortest-path table - a row for each starting vertex and a column for each destination vertex, giving the cheapest fare for every pair]

In the preceding chapter we found that Warshall's algorithm was a quick way to create a table showing which vertices could be reached from a given vertex, using one or many steps. An analogous approach for weighted graphs uses Floyd's algorithm, discovered by Robert Floyd in 1962. This is another way to create the kind of table just shown.

Let's discuss Floyd's algorithm with a simpler graph. The figure below shows a weighted, directed graph and its adjacency matrix.

[Figure: a weighted graph and its adjacency matrix]

The adjacency matrix shows the cost of all the one-edge paths. We want to extend this matrix to show the cost of all paths, regardless of length. For example, it's clear from the figure that we can go from one vertex to C in two steps at a cost equal to the sum of the costs of the two edges involved.

As in Warshall's algorithm, we systematically modify the adjacency matrix. We examine every cell in every row. If there's a non-zero weight in a cell - a path from that cell's row vertex to its column vertex - we then look in the column that corresponds to the row vertex (because that's where edges ending on the row vertex appear). If we find an entry there, at some other row, we know there is a path from that other vertex to the row vertex, and a path from the row vertex to the original column vertex. From this, we can deduce that there's a two-edge path from the first vertex to the last, with a weight equal to the sum of the two weights.

The figure below shows the steps when Floyd's algorithm is applied to this graph.

[Figure: Floyd's algorithm applied step by step to the example graph]

Row A is empty, so there's nothing to do there. In row B there are weights in the A and D columns, but there's nothing in column B, so the entries in row B can't be combined with any edges ending on B. In row C, however, we find a weight in one column; looking in column C, we find a weight at another row. Combining the two, we get a new two-edge path, and we enter its total weight in the matrix.

Row D shows an interesting situation: we will lower an existing cost. Combining an entry in row D with an entry that ends on D yields a two-edge path to a vertex for which the matrix already holds a larger cost. What do we do? Because the new cost is less than the old one, we replace the old entry with the new. In the case of multiple paths from one vertex to another, we want the table to reflect the path with the lowest cost.

The implementation of Floyd's algorithm is similar to that for Warshall's algorithm. However, instead of simply inserting a 1 into the table when a two-edge path is found, we add the costs of the two one-edge paths and insert the sum. We'll leave the details as an exercise.
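As a rough illustration of what that exercise involves, here is a minimal sketch, not the book's solution, of the Floyd-style triple loop. It assumes the adjacency-matrix conventions used by path.java above: an int matrix in which a large "infinity" value marks a missing edge.

   // Floyd-style update: after the loops, adjMat[z][x] holds the cheapest
   // cost from z to x using any number of edges. infinity marks "no path."
   static void floyd(int[][] adjMat, int nVerts, int infinity)
      {
      for(int y=0; y<nVerts; y++)              // for every row y
         for(int x=0; x<nVerts; x++)           // every cell (y, x)
            {
            if(adjMat[y][x] == infinity)       // no path y -> x; nothing to combine
               continue;
            for(int z=0; z<nVerts; z++)        // look down column y
               {
               if(adjMat[z][y] == infinity)    // no path z -> y
                  continue;
               int viaY = adjMat[z][y] + adjMat[y][x];  // two-edge cost z -> y -> x
               if(viaY < adjMat[z][x])         // keep only the cheaper cost
                  adjMat[z][x] = viaY;
               }
            }
      }

Note that the outermost loop plays the role of the "intermediate" vertex, so paths of any length are eventually built up from shorter ones.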
Efficiency

So far we haven't discussed the efficiency of the various graph algorithms. The issue is complicated by the two ways of representing graphs: the adjacency matrix and adjacency lists.

If an adjacency matrix is used, the algorithms we've discussed mostly require O(V²) time, where V is the number of vertices. Why? If you analyze the algorithms, you'll see that they involve examining each vertex once, and for that vertex going across its row in the adjacency matrix, looking at each edge in turn. In other words, each cell of the adjacency matrix, which has V² cells, is examined.

For large matrices, O(V²) isn't very good performance. If the graph is dense, there isn't much we can do about improving this performance. (As we noted earlier, by dense we mean a graph that has many edges - one in which many or most of the cells in the adjacency matrix are filled.)

However, many graphs are sparse, the opposite of dense. There's no clear-cut definition of how many edges a graph must have to be described as sparse or dense, but if each vertex in a large graph is connected by only a few edges, the graph would normally be described as sparse.

In a sparse graph, running times can be improved by using the adjacency-list representation rather than the adjacency matrix. This is easy to understand: you don't waste time examining adjacency-matrix cells that don't hold edges.

For unweighted graphs, the depth-first search with adjacency lists requires O(V+E) time, where V is the number of vertices and E is the number of edges. For weighted graphs, both the minimum spanning tree and the shortest-path algorithms require O((E+V)logV) time. In large, sparse graphs these times can represent dramatic improvements over the adjacency-matrix approach. However, the algorithms are somewhat more complicated, which is why we've used the adjacency-matrix approach throughout this chapter. You can consult Sedgewick (see the appendix, "Further Reading") and other writers for examples of graph algorithms using the adjacency-list approach.

Warshall's and Floyd's algorithms are slower than the other algorithms we've discussed so far in this book: they both operate in O(V³) time. This is the result of the three nested loops used in their implementation.

Intractable Problems

In this book we've seen Big O values ranging from O(1), through O(N), O(N*logN), and O(N²), up to O(V³) for Warshall's and Floyd's algorithms. Even O(N³) can be solved in a reasonable length of time for values of N in the thousands. Algorithms with these Big O values can be used to find solutions to most practical problems.
However, some algorithms have Big O values that are so large that they can be used only for relatively small values of N. Many real-world problems that require such algorithms simply cannot be solved in a reasonable length of time. Such problems are said to be intractable. (Another term used for such problems is NP-complete, where NP means non-deterministic polynomial. An explanation of what this means is beyond the scope of this book.)

The Knight's Tour

The Knight's Tour (a programming project in an earlier chapter) is an example of an intractable problem, because the number of possible moves is so large. The total number of possible move sequences is difficult to calculate, but we can approximate it. Each move can end on a maximum of eight squares. This number is reduced by moves that would be off the edge of the board and moves that would end on a square that was already visited. In the early stages of a tour there will be closer to eight moves, but this number will gradually decrease as the board fills up. Let's assume (conservatively) an average of only two possible moves from each position. After the initial square, the knight can visit 63 more squares; thus, there is a total of about 2⁶³ possible move sequences, which is about 10¹⁹. Assume a computer can make a million moves a second. There are roughly 3×10⁷ seconds in a year, so the computer can make about 3×10¹³ moves in a year. Solving the puzzle by brute force can therefore be expected to take on the order of 3×10⁵ years - around a third of a million years.

This particular problem can be made more tractable if strategies are used to "prune" the game tree. One is Warnsdorff's heuristic (von Warnsdorff), which specifies that you always move to the square that has the fewest possible exit moves.

The Traveling Salesman Problem

Here's another famous intractable problem. Suppose you're a salesperson and you need to drive to all the cities where you have clients. You would like to minimize the number of miles you travel. You know the distance from each city to every other city. You want to start in your home city, visit each client city once and only once, and return to your home city. In what sequence should you visit these cities to minimize the total miles traveled?

In graph theory this is called the traveling salesman problem, often abbreviated TSP. The figure below shows an arrangement of cities and distances. What's the shortest way to travel from A through each of the other cities and back to A? Notice that it's not necessary that every pair of cities be connected by an edge; because of geography, it may be impossible to drive from Washington, D.C., to New York without going through Philadelphia, for example.
[Figure: cities and distances for the traveling salesman problem]

To find the shortest route, you list all the possible permutations of cities (Boston-Seattle-Miami, Boston-Miami-Seattle, Miami-Boston-Seattle, and so on) and calculate the total distance for each permutation. The route ABCEDA has some definite total length, while the route ABCDEA is impossible because one of the edges it requires doesn't exist. Unfortunately, the number of permutations can be very large: it's the factorial of the number of cities (not counting your home city). If there are N cities to visit, there are N choices for the first city, N-1 for the second, N-2 for the third, and so on - a total of N*(N-1)*(N-2)*...*2*1, or N!, possible routes. The problem is impractical to solve for even a modest number of cities. Again, there are strategies to reduce the number of sequences that must be checked, but this helps only a little.

A weighted graph is used to implement the problem, with weights representing miles and vertices representing the cities. The graph can be non-directed if the distance is the same going from A to B as from B to A, as it usually is when driving. If the weights represent airfares, they may be different in different directions, in which case a directed graph is used.

Hamiltonian Cycles

A problem that's similar to the TSP but more abstract is that of finding a Hamiltonian cycle of a graph. As we noted earlier, a cycle is a path that starts and ends on the same vertex. A Hamiltonian cycle is one that, in addition, visits every other vertex in the graph exactly once. Unlike we did with the TSP, we don't care about distances; all we want to know is whether such a cycle exists. In the figure, the route ABCEDA is a Hamiltonian cycle, while ABCDEA is not. The Knight's Tour problem is an example of a Hamiltonian cycle (if you assume the knight returns to its starting square). Finding a Hamiltonian cycle takes the same O(N!) time as the TSP. You'll see the term exponential time used for Big O values such as 2^N and N! (which grows even more rapidly than the exponential).
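To make the N! growth concrete, here is a minimal brute-force sketch (not the book's project solution). It assumes an adjacency matrix dist[][] of driving distances, with city 0 as the home city and a hypothetical infinity value marking a missing edge; it simply tries every ordering of the other cities and keeps the cheapest complete tour.

   // try every ordering of cities 1..n-1, starting and ending at city 0
   static int bestTour(int[][] dist, int infinity)
      {
      int n = dist.length;
      int[] perm = new int[n - 1];
      for(int j=0; j<n-1; j++)
         perm[j] = j + 1;                 // cities 1..n-1
      return permute(dist, perm, 0, infinity);
      }

   static int permute(int[][] dist, int[] perm, int k, int infinity)
      {
      if(k == perm.length)                // a complete ordering: cost it
         return tourCost(dist, perm, infinity);
      int best = infinity;
      for(int j=k; j<perm.length; j++)
         {
         swap(perm, k, j);                // choose perm[k]
         best = Math.min(best, permute(dist, perm, k+1, infinity));
         swap(perm, k, j);                // undo the choice
         }
      return best;
      }

   static int tourCost(int[][] dist, int[] perm, int infinity)
      {
      int cost = 0;
      int prev = 0;                       // start at home city 0
      for(int city : perm)
         {
         if(dist[prev][city] == infinity)
            return infinity;              // missing edge: impossible route
         cost += dist[prev][city];
         prev = city;
         }
      if(dist[prev][0] == infinity)
         return infinity;                 // must be able to return home
      return cost + dist[prev][0];
      }

   static void swap(int[] a, int i, int j)
      { int t = a[i]; a[i] = a[j]; a[j] = t; }

The recursion visits all (N-1)! orderings, which is exactly why the approach is hopeless for more than a handful of cities.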
Summary

- In a weighted graph, edges have an associated number called the weight, which might represent distances, costs, times, or other quantities.
- The minimum spanning tree in a weighted graph minimizes the weights of the edges necessary to connect all the vertices.
- An algorithm using a priority queue can be used to find the minimum spanning tree of a weighted graph.
- The minimum spanning tree of a weighted graph models real-world situations such as installing utility cables between cities.
- The shortest-path problem in a non-weighted graph involves finding the minimum number of edges between two vertices.
- Solving the shortest-path problem for weighted graphs yields the path with the minimum total edge weight.
- The shortest-path problem for weighted graphs can be solved with Dijkstra's algorithm.
- The algorithms for large, sparse graphs generally run much faster if the adjacency-list representation of the graph is used rather than the adjacency matrix.
- The all-pairs shortest-path problem is to find the total weight of the edges between every pair of vertices in a graph. Floyd's algorithm can be used to solve this problem.
- Some graph algorithms take exponential time and are therefore not practical for graphs with more than a few vertices.

Questions

These questions are intended as a self-test for readers. Answers may be found in the appendix.

1. The weight in a weighted graph is a property of the graph's _______.

2. In a weighted graph, the minimum spanning tree (MST) tries to minimize
   a. the number of edges from the starting vertex to a specified vertex.
   b. the number of edges connecting all the vertices.
   c. the total weight of the edges from the starting vertex to a specified vertex.
   d. the total weight of the edges connecting all the vertices.
3. True or False: The weight of the MST depends on the starting vertex.

4. In the MST algorithm, what is removed from the priority queue?

5. In the cable TV example, each edge added to the MST connects
   a. the starting vertex to an adjacent vertex.
   b. an already-connected city to an unconnected city.
   c. the current vertex to an adjacent vertex.
   d. two cities with offices.

6. The MST algorithm "prunes" an edge from the list when the edge leads to a vertex that _______.

7. True or False: The shortest-path problem (SPP) must be carried out on a directed graph.

8. Dijkstra's algorithm finds the shortest path
   a. from one specified vertex to all other vertices.
   b. from one specified vertex to another specified vertex.
   c. from all vertices to all other vertices that can be reached along one edge.
   d. from all vertices to all other vertices that can be reached along multiple edges.

9. True or False: The rule in Dijkstra's algorithm is to always put in the tree the vertex that is closest to the starting vertex.

10. In the railroad fares example, a fringe town is one
    a. to which the distance is known, but from which no distances are known.
    b. which is in the tree.
    c. to which the distance is known and which is in the tree.
    d. which is completely unknown.

11. The all-pairs shortest-path problem involves finding the shortest path
    a. from the starting vertex to every other vertex.
    b. from every vertex to every other vertex.
    c. from the starting vertex to every vertex that is one edge away.
    d. from every vertex to every other vertex that is one or more edges away.
12. Floyd's algorithm is to weighted graphs what _______ is to unweighted graphs.

13. Floyd's algorithm uses the _______ representation of a graph.

14. What is an approximate Big O time for an attempt to solve the Knight's Tour?

15. In the TSP figure, is the route ABCEDA the minimum solution for the traveling salesman problem?

Experiments

Carrying out these experiments will help to provide insights into the topics covered in this chapter. No programming is involved.

1. Use the GraphW Workshop applet to find the minimum spanning tree of the graph shown in the figure "Train fares in Magnaguena." Consider the graph to be undirected; that is, ignore the arrows.

2. Use the GraphDW Workshop applet to solve the shortest-path problem for the graph in the figure "Train fares in Magnaguena," but derive new weights for all the edges by subtracting each weight shown in the figure from a single larger number.

3. Draw a graph with five vertices and five edges. Then use pencil and paper to implement Dijkstra's algorithm for this graph. Show the tree and the shortest-path array at each step.

Programming Projects

Writing programs to solve the Programming Projects helps to solidify your understanding of the material and demonstrates how the chapter's concepts are applied. (As noted in the Introduction, qualified instructors may obtain completed solutions to the Programming Projects on the publisher's Web site.)

1. Modify the path.java program (the listing above) to print a table of the minimum costs to get from any vertex to any other vertex. This exercise will require some fiddling with routines that assume the starting vertex is always A.
2. So far we've implemented graphs as adjacency matrices or adjacency lists. Another approach is to use Java references to represent edges, so that a vertex object contains a list of references to the other vertices that it's connected to. In a directed graph a reference used this way is especially intuitive because it "points" from one vertex to another. Write a program that implements this scheme (a sketch of one possible starting point follows these projects). The main() method should be similar to main() in the path.java program, so that it creates the graph shown in the figure, using the same addVertex() and addEdge() calls. It should then display a connectivity table of the graph to prove that the graph is constructed properly. You'll need to store the weight of each edge somewhere. One approach is to use an Edge class, which stores its weight and the vertex on which it ends. Each vertex then keeps a list of Edge objects - that is, edges that start on that vertex.

3. Implement Floyd's algorithm. You can start with the path.java program and modify it as appropriate. For instance, you can delete all the shortest-path code. Keep the infinity representation for unreachable vertices. By doing this, you will avoid the need to check for 0 when comparing an existing cost with a newly derived cost; the costs on all possible routes will be less than infinity. You should be able to enter graphs of arbitrary complexity into main().

4. Implement the traveling salesman problem described in the "Intractable Problems" section in this chapter. In spite of its intractability, your program will have no trouble solving the problem for a small N - say, a few cities. Try a non-directed graph. Use the brute-force approach of testing every possible sequence of cities. For a way to permute the sequence of cities, see the anagram.java program in the chapter "Recursion." Use infinity to represent nonexistent edges. That way, you won't need to abort the calculation of a sequence when it turns out that an edge from one city to the next does not exist; any total greater than or equal to infinity is an impossible route. Also, don't worry about eliminating symmetrical routes; display both ABCDEA and AEDCBA, for example.

5. Write a program that discovers and displays all the Hamiltonian cycles of a weighted, non-directed graph.
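For the second project, a minimal sketch of the reference-based representation it describes might look like the following. The class names Edge and Vertex here are hypothetical starting points, not the book's solution; building the graph, the weights, and the connectivity display are left to the project.

   import java.util.ArrayList;

   class Edge
      {
      public Vertex dest;       // the vertex this edge ends on
      public int weight;        // the edge's weight

      public Edge(Vertex d, int w)
         { dest = d; weight = w; }
      }

   class Vertex
      {
      public char label;
      public ArrayList<Edge> edges = new ArrayList<>();  // edges that start on this vertex

      public Vertex(char lab)
         { label = lab; }

      public void addEdge(Vertex dest, int weight)       // directed: this -> dest
         { edges.add(new Edge(dest, weight)); }
      }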
When to Use What

In this chapter:
- General-Purpose Data Structures
- Special-Purpose Data Structures
- Sorting
- Graphs
- External Storage

In this chapter we briefly summarize what we've learned so far, with an eye toward deciding what data structure or algorithm to use in a particular situation. This chapter comes with the usual caveats. Of necessity, it's very general. Every real-world situation is unique, so what we say here may not be the right answer to your problem.

This chapter is divided into these somewhat arbitrary sections:

- General-purpose data structures: arrays, linked lists, trees, hash tables
- Specialized data structures: stacks, queues, priority queues, graphs
- Sorting: insertion sort, Shellsort, quicksort, mergesort, heapsort
- Graphs: adjacency matrix, adjacency list
- External storage: sequential storage, indexed files, B-trees, hashing

NOTE: For detailed information on these topics, refer to the individual chapters in this book.

General-Purpose Data Structures

If you need to store real-world data such as personnel records, inventories, contact lists, or sales data, you need a general-purpose data structure. The structures of this type that we've discussed in this book are arrays, linked lists, trees, and hash tables. We call these general-purpose data structures because they are used to store and retrieve data using key values.
This works for general-purpose database programs (as opposed to specialized structures such as stacks, which allow access to only certain data items).

Which of these general-purpose data structures is appropriate for a given problem? The figure below shows a first approximation to this question. However, there are many factors besides those shown in the figure. For more detail, we'll explore some general considerations first and then zero in on the individual structures.

[Figure: relationship of general-purpose data structures - a decision diagram that starts by asking whether there is a small amount of data, then whether searching and insertion must be very fast, whether the key distribution is guaranteed random, whether the amount of data is predictable, and whether search speed is more important than insertion speed; the outcomes are an unordered array, an ordered array, a linked list, a binary search tree, a balanced tree, or a hash table]

Speed and Algorithms

The general-purpose data structures can be roughly arranged in terms of speed: arrays and linked lists are slow, trees are fairly fast, and hash tables are very fast.

However, don't draw the conclusion from the figure that it's always best to use the fastest structures. There's a penalty for using them. First, they are - in varying degrees - more complex to program than the array and linked list. Also, hash tables require you to know in advance about how much data can be stored, and they don't use memory very efficiently. Ordinary binary trees will revert to slow O(N) operation for ordered data, and balanced trees, which avoid this problem, are difficult to program.
Processor Speed: A Moving Target

The fast structures come with penalties, and another development makes the slow structures more attractive: every year there's an increase in the CPU and memory-access speed of the latest computers. Moore's Law (postulated by Gordon Moore in 1965) specifies that CPU performance will double every 18 months or so. This adds up to an astonishing difference in performance between the earliest computers and those available today, and there's no reason to think this increase will slow down any time soon.

Suppose a computer a few years ago handled an array of a certain size in acceptable time. Now, computers are much faster, so an array many times that size might run at the same speed. Many writers provide estimates of the maximum size you can make a data structure before it becomes too slow. Don't trust these estimates (including those in this book). Today's estimate doesn't apply to tomorrow.

Instead, start by considering the simple data structures. Unless it's obvious they'll be too slow, code a simple version of an array or linked list and see what happens. If it runs in acceptable time, look no further. Why slave away on a balanced tree when no one would ever notice if you used an array instead? Even if you must deal with thousands or tens of thousands of items, it's still worthwhile to see how well an array or linked list will handle them. Only when experimentation shows their performance to be too slow should you revert to more sophisticated data structures.

Advantages of Java References

Java has an advantage over some languages in the speed with which objects can be manipulated, because, in most data structures, Java stores only references, not actual objects. Therefore, most algorithms will run faster than in languages where actual objects occupy space in a data structure. In analyzing the algorithms, it's not the case, as when objects themselves are stored, that the time to "move" an object depends on the size of the object. Because only a reference is moved, it doesn't matter how large the object is.

Of course, in other languages, such as C++, pointers to objects can be stored instead of the objects themselves; this has the same effect as using references, but the syntax is more complicated.

Libraries

Libraries of data structures are available commercially in all major programming languages. Languages themselves may have some structures built in. Java, for example, includes Vector, Stack, and Hashtable classes. C++ includes the Standard Template Library (STL), which contains classes for many data structures and algorithms.
Using a commercial library may eliminate or at least reduce the programming necessary to create the data structures described in this book. When that's the case, using a complex structure such as a balanced tree, or a delicate algorithm such as quicksort, becomes a more attractive possibility. However, you must ensure that the class can be adapted to your particular situation.

Arrays

In many situations the array is the first kind of structure you should consider when storing and manipulating data. Arrays are useful when

- the amount of data is reasonably small.
- the amount of data is predictable in advance.

If you have plenty of memory, you can relax the second condition; just make the array big enough to handle any foreseeable influx of data.

If insertion speed is important, use an unordered array. If search speed is important, use an ordered array with a binary search. Deletion is always slow in arrays because an average of half the items must be moved to fill in the newly vacated cell. Traversal is fast in an ordered array but not supported in an unordered array.

Vectors, such as the Vector class supplied with Java, are arrays that expand themselves when they become too full. Vectors may work when the amount of data isn't known in advance. However, there may periodically be a significant pause while they enlarge themselves by copying the old data into the new space.

Linked Lists

Consider a linked list whenever the amount of data to be stored cannot be predicted in advance or when data will frequently be inserted and deleted. The linked list obtains whatever storage it needs as new items are added, so it can expand to fill all of available memory; and there is no need to fill "holes" during deletion, as there is in arrays.

Insertion is fast in an unordered list. Searching and deletion are slow (although deletion is faster than in an array), so, like arrays, linked lists are best used when the amount of data is comparatively small.

A linked list is somewhat more complicated to program than an array, but is simple compared with a tree or hash table.

Binary Search Trees

A binary tree is the first structure to consider when arrays and linked lists prove too slow. A tree provides fast O(logN) insertion, searching, and deletion. Traversal is O(N), which is the maximum for any data structure (by definition, you must visit every item).
You can also find the minimum and maximum quickly and traverse a range of items.

An unbalanced binary tree is much easier to program than a balanced tree, but unfortunately ordered data can reduce its performance to O(N) time, no better than a linked list. However, if you're sure the data will arrive in random order, there's no point using a balanced tree.

Balanced Trees

Of the various kinds of balanced trees, we discussed red-black trees and 2-3-4 trees. They are both balanced trees, and thus guarantee O(logN) performance whether the input data is ordered or not. However, these balanced trees are challenging to program, with the red-black tree being the more difficult. They also impose additional memory overhead, which may or may not be significant.

The problem of complex programming may be reduced if a commercial class can be used for a tree. In some cases a hash table may be a better choice than a balanced tree. Hash-table performance doesn't degrade when the data is ordered.

There are other kinds of balanced trees, including AVL trees, splay trees, 2-3 trees, and so on, but they are not as commonly used as the red-black tree.

Hash Tables

Hash tables are the fastest data storage structure. This makes them a necessity for situations in which a computer program, rather than a human, is interacting with the data. Hash tables are typically used in spelling checkers and as symbol tables in computer language compilers, where a program must check thousands of words or symbols in a fraction of a second.

Hash tables may also be useful when a person, as opposed to a computer, initiates data-access operations. As noted earlier, hash tables are not sensitive to the order in which data is inserted, and so can take the place of a balanced tree. Programming is much simpler than for balanced trees.

Hash tables require additional memory, especially for open addressing. Also, the amount of data to be stored must be known fairly accurately in advance, because an array is used as the underlying structure.

A hash table with separate chaining is the most robust implementation, unless the amount of data is known accurately in advance, in which case open addressing offers simpler programming because no linked list class is required.

Hash tables don't support any kind of ordered traversal, or access to the minimum or maximum items. If these capabilities are important, the binary search tree is a better choice.
Comparing the General-Purpose Storage Structures

The table below summarizes the speeds of the various general-purpose data storage structures using Big O notation.

   Data Structure                            Search     Insertion   Deletion   Traversal
   Array                                     O(N)       O(1)        O(N)       -
   Ordered array                             O(logN)    O(N)        O(N)       O(N)
   Linked list                               O(N)       O(1)        O(N)       -
   Ordered linked list                       O(N)       O(N)        O(N)       O(N)
   Binary tree (average)                     O(logN)    O(logN)     O(logN)    O(N)
   Binary tree (worst case)                  O(N)       O(N)        O(N)       O(N)
   Balanced tree (average and worst case)    O(logN)    O(logN)     O(logN)    O(N)
   Hash table                                O(1)       O(1)        O(1)       -

Insertion in an unordered array is assumed to be at the end of the array. The ordered array uses a binary search, which is fast, but insertion and deletion require moving half the items on the average, which is slow. Traversal implies visiting the data items in order of ascending or descending keys; the - means this operation is not supported.

Special-Purpose Data Structures

The special-purpose data structures discussed in this book are the stack, the queue, and the priority queue. These structures, rather than supporting a database of user-accessible data, are usually used by a computer program to aid in carrying out some algorithm. We've seen examples of this throughout this book, such as in the chapters "Graphs" and "Weighted Graphs," where stacks, queues, and priority queues are all used in graph algorithms.

Stacks, queues, and priority queues are abstract data types (ADTs) that are implemented by a more fundamental structure such as an array, linked list, or (in the case of the priority queue) a heap. These ADTs present a simple interface to the user, typically allowing only insertion and the ability to access or delete only one data item. These items are

- for stacks, the last item inserted
- for queues, the first item inserted
- for priority queues, the item with the highest priority
These ADTs can be seen as conceptual aids. Their functionality could be obtained using the underlying structure (such as an array) directly, but the reduced interface they offer simplifies many problems. These ADTs can't be conveniently searched for an item by key value or traversed.

Stack

A stack is used when you want access only to the last data item inserted; it's a last-in-first-out (LIFO) structure.

A stack is often implemented as an array or a linked list. The array implementation is efficient because the most recently inserted item is placed at the end of the array, where it's also easy to delete. Stack overflow can occur, but is not likely if the array is reasonably sized, because stacks seldom contain huge amounts of data.

If the stack will contain a lot of data and the amount can't be predicted accurately in advance (as when recursion is implemented as a stack), a linked list is a better choice than an array. A linked list is efficient because items can be inserted and deleted quickly from the head of the list. Stack overflow can't occur (unless the entire memory is full). A linked list is slightly slower than an array because memory allocation is necessary to create a new link for insertion, and deallocation of the link is necessary at some point following removal of an item from the list.

Queue

A queue is used when you want access only to the first data item inserted; it's a first-in-first-out (FIFO) structure.

Like stacks, queues can be implemented as arrays or linked lists. Both are efficient. The array requires additional programming to handle the situation in which the queue wraps around at the end of the array. A linked list must be double-ended, to allow insertions at one end and deletions at the other.

As with stacks, the choice between an array implementation and a linked list implementation is determined by how well the amount of data can be predicted. Use the array if you know about how much data there will be; otherwise, use a linked list.

Priority Queue

A priority queue is used when the only access desired is to the data item with the highest priority. This is the item with the largest (or sometimes the smallest) key.

Priority queues can be implemented as an ordered array or as a heap. Insertion into an ordered array is slow, but deletion is fast. With the heap implementation, both insertion and deletion take O(logN) time.
Use an array or a double-ended linked list if insertion speed is not a problem. The array works when the amount of data to be stored can be predicted in advance; the linked list when the amount of data is unknown. If speed is important, a heap is a better choice.

Comparison of Special-Purpose Structures

The table below shows the Big O times for stacks, queues, and priority queues. These structures don't support searching or traversal.

   Data Structure                   Insertion   Deletion   Comment
   Stack (array or linked list)     O(1)        O(1)       Deletes most recently inserted item
   Queue (array or linked list)     O(1)        O(1)       Deletes least recently inserted item
   Priority queue (ordered array)   O(N)        O(1)       Deletes highest-priority item
   Priority queue (heap)            O(logN)     O(logN)    Deletes highest-priority item
Sorting

As with the choice of data structures, it's worthwhile initially to try a slow but simple sort, such as the insertion sort. It may be that the fast processing speeds available in modern computers will allow sorting of your data in a reasonable time. (As a wild guess, the slow sort might be appropriate for modest numbers of items.)

Insertion sort is also good for almost-sorted files, operating in about O(N) time if not too many items are out of place. This is typically the case where a few new items are added to an already-sorted file.

If the insertion sort proves too slow, then the Shellsort is the next candidate. It's fairly easy to implement and not very temperamental. Sedgewick estimates it to be useful up to several thousand items.

Only when the Shellsort proves too slow should you use one of the more complex but faster sorts: mergesort, heapsort, or quicksort. Mergesort requires extra memory, heapsort requires a heap data structure, and both are somewhat slower than quicksort, so quicksort is the usual choice when the fastest sorting time is necessary.

However, quicksort is suspect if there's a danger that the data may not be random, in which case it may deteriorate to O(N²) performance. For potentially non-random data, heapsort is better. Quicksort is also prone to subtle errors if it is not implemented correctly. Small mistakes in coding can make it work poorly for certain arrangements of data, a situation that may be hard to diagnose.

The table below summarizes the running time for various sorting algorithms. The column labeled Comparison attempts to estimate the minor speed differences between algorithms with the same average Big O times. (There's no entry for Shellsort because there are no other algorithms with the same Big O performance.)

   Sort        Average       Worst         Comparison   Extra Memory
   Bubble      O(N²)         O(N²)         Poor         No
   Selection   O(N²)         O(N²)         Fair         No
   Insertion   O(N²)         O(N²)         Good         No
   Shellsort   O(N^(3/2))    O(N^(3/2))    -            No
   Quicksort   O(N*logN)     O(N²)         Good         No
   Mergesort   O(N*logN)     O(N*logN)     Fair         Yes
   Heapsort    O(N*logN)     O(N*logN)     Fair         No
when to use what storagewhich generally means disk files we discussed external storage in the second parts of trees and external storage,and "hash tables we assumed that data is stored in disk file in fixed-size units called blockseach of which holds number of records ( record in disk file holds the same sort of data as an object in main memory like an objecta record has key value used to access it we also assumed that reading and writing operations always involve single blockand these read and write operations are far more time-consuming than any processing of data in main memory thusfor fast operation the number of disk accesses must be minimized sequential storage the simplest approach is to store records randomly and read them sequentially when searching for one with particular key new records can simply be inserted at the end of the file deleted records can be marked as deletedor records can be shifted down (as in an arrayto fill in the gap on the averagesearching and deletion will involve reading half the blocksso sequential storage is not very fastoperating in (ntime stillit might be satisfactory for small number of records indexed files speed is increased dramatically when indexed files are used in this scheme an index of keys and corresponding block numbers is kept in main memory to access record with specified keythe index is consulted it supplies the block number for the keyand only one block needs to be readtaking ( time several indices with different kinds of keys can be used (one for last namesone for social security numbersand so onthis scheme works well until the index becomes too large to fit in memory typicallythe index files are themselves stored on disk and read into memory as needed the disadvantage of indexed files is that at some point the index must be created this probably involves reading through the file sequentiallyso creating the index is slow alsothe index will need to be updated when items are added to the file -trees -trees are multiway treescommonly used in external storagein which nodes correspond to blocks on the disk as in other treesthe algorithms find their way down the treereading one block at each level -trees provide searchinginsertionand
B-Trees

B-trees are multiway trees, commonly used in external storage, in which nodes correspond to blocks on the disk. As in other trees, the algorithms find their way down the tree, reading one block at each level. B-trees provide searching, insertion, and deletion of records in O(logN) time. This is quite fast and works even for very large files. However, the programming is not trivial.

Hashing

If it's acceptable to use about twice as much external storage as a file would normally take, then external hashing might be a good choice. It has the same access time as indexed files, O(1), but can handle larger files.

The figure "Relationship of External Storage Choices" sums up these choices, rather impressionistically, as a decision flowchart: if speed isn't important, a sequential search will do; if speed matters but isn't critical, use a B-tree; if speed is critical, use external hashing when extra storage is available and indexed files when it isn't.
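The external-hashing approach dispenses with the in-memory index entirely: the key itself determines which block to read. The following is a minimal sketch, not the book's code; the block count and class name are invented here, and a real scheme must also handle blocks that fill up.

// External hashing sketch: hash the key straight to a block number, so a
// lookup costs one block read with no in-memory index at all.
public class ExternalHash
{
   private static final int NUM_BLOCKS = 2048;   // assumed; sized generously to leave slack

   public static int blockFor(String key)
   {
      int hash = key.hashCode() & 0x7fffffff;    // force a non-negative value
      return hash % NUM_BLOCKS;
   }

   public static void main(String[] args)
   {
      System.out.println("\"Smith\" maps to block " + blockFor("Smith"));
      System.out.println("\"Jones\" maps to block " + blockFor("Jones"));
   }
}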
Virtual Memory

Sometimes you can let your operating system's virtual memory capabilities (if it has them) solve disk access problems with little programming effort on your part.

If you read a file that's too big to fit in main memory, the virtual memory system will read in the part of the file that fits and store the rest on the disk. As you access different parts of the file, they will be read from the disk automatically and placed in memory.

You can apply internal algorithms to the entire file just as if it were all in memory at the same time, and let the operating system worry about reading the appropriate part of the file if it isn't in memory already.

Of course, operation will be much slower than when the entire file is in memory, but this would also be true if you dealt with the file block by block using one of the external-storage algorithms. It may be worth simply ignoring the fact that a file doesn't fit in memory and seeing how well your algorithms work with the help of virtual memory. Especially for files that aren't much larger than the available memory, this may be an easy solution.

Onward

We've come to the end of our survey of data structures and algorithms. The subject is large and complex, so no one book can make you an expert, but we hope this book has made it easy for you to learn about the fundamentals. Appendix B, "Further Reading," contains suggestions for further study.
Running the Workshop Applets and Example Programs

In this appendix: The Workshop Applets; The Example Programs; The Sun Microsystems Software Development Kit; Multiple Class Files; Other Development Systems.

In this appendix we discuss the details of running the Workshop applets and the example programs. The Workshop applets are graphics-based demonstration programs that show what trees and other data structures look like. The example programs, whose code is shown in the text, present runnable Java code.

We also discuss the Sun Microsystems Java 2 Standard Edition (J2SE) Software Development Kit (SDK), which you can use not only to run the applets and example programs in this book but to modify the example programs and to write your own programs.

Downloadable versions of this book's applets and example programs are available on the Sams Web site: www.samspublishing.com. Log on, and use the book's International Standard Book Number (ISBN) to access the book's Web page, where you'll find a link to the downloads.

The Workshop Applets

An applet is a special kind of Java program that is easy to send over the Internet's World Wide Web. Because Java applets are designed for the Internet, they can run on any computer platform that has an appropriate Web browser or applet viewer.
In this book, the Workshop applets provide dynamic, interactive, graphics-based demonstrations of the concepts discussed in the text. For example, the chapter "Binary Trees" includes a Workshop applet that shows a tree in the applet window. Clicking the applet's buttons will show the steps involved in inserting a new node into the tree, deleting an existing node, traversing the tree, and so on. Other chapters include appropriate Workshop applets.

You can run the Workshop applets immediately after downloading them, using most popular Web browsers. This includes the current versions of Microsoft Internet Explorer and Netscape Communicator. Commercial Java development products also have an applet viewer utility that runs applets. You can also run the Workshop applets with the appletviewer utility included with the SDK.

Here's how to run the applets with a typical Web browser working offline: select Open from the File menu and navigate to the appropriate directory. Each Workshop applet consists of a subdirectory containing several files with the .class extension and one file with the .html extension. Open the .html file; the applet should appear on the screen.

The Example Programs

The example programs are intended to show as simply as possible how the data structures and algorithms discussed in this book can be implemented in Java. These example programs consist of Java applications (as opposed to applets). Java applications are not meant to be sent over the Web, but instead run as normal programs on a specific machine.

Java applications can run in either console mode or graphics mode. For simplicity, our example programs run in console mode, which means that output is displayed as text and input is performed by the user typing at the keyboard. In the Windows environment the console mode runs in an MS-DOS box. There is no graphics display in console mode.

The source code for the example programs is presented in the text of the book. Source files, consisting of the same text as in the book, can be downloaded from the Sams Web site.

The Sun Microsystems Software Development Kit

Both the Workshop applets and the example programs can be executed using utility programs that are part of Sun's SDK. The SDK can be downloaded from Sun's Web site: www.sun.com. Look for the Java 2 Standard Edition (J2SE) Software Development Kit. This is a large download, but it gives you everything you need not only to run the applets and programs in this book, but to develop your own Java applets and applications.
Command-Line Programs

The SDK operates in text mode, using the command line to launch its various programs. In Windows, you'll need to open an MS-DOS box to obtain this command line. Click the Start button, and find the program called MS-DOS Prompt. It may be in the Accessories folder, and it may be called something else, like Command Prompt.

Then, in MS-DOS, use the cd (for change directory) command to move to the appropriate subdirectory on your hard disk, where either a Workshop applet or an example program is stored. Then execute the applet or program using the appropriate SDK utility, as detailed below.

Setting the Path

In Windows, the location of the SDK utility programs should be specified in a PATH statement in the autoexec.bat file so they can be accessed conveniently from within any subdirectory. This PATH statement may be placed automatically in your autoexec.bat file when you run the setup program for the SDK. Otherwise, use the Notepad utility to insert the line

SET PATH=C:\jdk\bin

into the autoexec.bat file, following any other SET PATH commands. You'll find autoexec.bat in your root directory. Close the MS-DOS box and open a new one to activate this new path. (Modify the version number and directory name as necessary.)

Viewing the Workshop Applets

To use the SDK to run the Workshop applets, first use the cd command in MS-DOS to navigate to the desired subdirectory. For example, to execute the Array Workshop applet from the "Arrays" chapter, move to its directory:

C:\>cd javaapps
C:\javaapps>cd chap
C:\javaapps\chap>cd array

Then use the appletviewer utility from the SDK to execute the applet's .html file:

C:\javaapps\chap\array>appletviewer array.html

The applet should start running. (Sometimes an applet takes a while to load, so be patient.) The applet's appearance should be close to the screen shots shown in the text. It won't look exactly the same because every applet viewer and browser interprets HTML and Java format somewhat differently.
As we noted, you can also use most Web browsers to execute the applets.

Operating the Workshop Applets

Each chapter gives instructions for operating specific Workshop applets. In general, remember that in most cases you'll need to repeatedly click a single button to carry out an operation. Each press of the Ins button in the Array Workshop applet, for example, causes one step of the insertion process to be carried out. Generally, a message is displayed telling what's happening at each step.

You should complete each operation--that is, each sequence of button clicks--before clicking a different button to start a different operation. For example, keep clicking the Find button until the item with the specified key is located and you see the message Press any button. Only then should you switch to another operation involving another button, such as inserting a new item with the Ins button.

The sorting applets from the "Simple Sorting" and "Advanced Sorting" chapters have a Step button with which you can view the sorting process one step at a time. They also have a Run mode in which the sort runs at high speed without additional button clicks. Just click the Run button once and watch the bars sort themselves. To pause, you can click the Step button at any time. Running can be resumed by clicking the Run button again.

It's not intended that you study the code for the Workshop applets, which is mostly concerned with the graphic presentation. Hence, source listings are not provided.

Running the Example Programs

Each example program consists of a subdirectory containing a .java file and a number of .class files. The .java file is the source file, which also appears in the text. It must be compiled before you can run it. The .class files are compiled and ready to run if you have a Java interpreter. You can use the java interpreter from Sun's SDK to run the example programs directly from the .class files.

For each program, one .class file ends with the letters App, for application. It's this file that must be invoked with java. From an MS-DOS prompt, go to the appropriate subdirectory (using the cd command) and find this App file. For example, for the insertSort program from the "Simple Sorting" chapter, go to the insertSort subdirectory for that chapter. (Don't confuse the directory holding the applets with the directory holding the example programs.) You'll find a .java file and several .class files. One of these is InsertSortApp.class. To execute the program, enter

C:\chap\insertSort>java InsertSortApp
Don't type a file extension after the filename. The insertSort program should run, and you'll see a text display of unsorted and sorted data. In some example programs you'll see a prompt inviting you to enter input, which you type at the keyboard.

Compiling the Example Programs

You can experiment with the example programs by modifying them and then compiling and running the modified versions. You can also write your own applications from scratch, compile them, and run them. To compile a Java application, you use the javac program, invoking the example's .java file. For example, to compile the insertSort program, you would go to the insertSort directory and enter

C:\chap\insertSort>javac insertSort.java

This time you do need to add the .java file extension. This command will compile the .java file into as many .class files as there are classes in the program. If there are errors in the source code, you'll see them displayed on the screen.

Editing the Source Code

Many text editors are appropriate for modifying the .java source files or writing new ones. For example, you can invoke an MS-DOS editor called edit from the DOS command line, and Windows includes the Notepad editor. Many commercial text editors are available as well.

Don't use a fancy word processor, such as Microsoft Word, for editing source files. Word processors typically generate output files with strange characters and formatting information, which the Java interpreter won't understand.

Terminating the Example Programs

You can terminate any running console-mode program, including any of the example programs, by pressing the Ctrl-C key combination (the Ctrl key and the C key pressed at the same time). Some example programs have a termination procedure that's mentioned in the text, such as pressing a designated key at the beginning of a line, but for the others you must press Ctrl-C.

Multiple Class Files

Often several Workshop applets, or several example programs, will use class files with the same names. Note, however, that these files may not be identical. The applet or example program may not work if the wrong class file is used with it, even if the file has the correct name.
Invoking the wrong file should not normally be a problem because all the files for a given program are placed in the same subdirectory. However, if you move files by hand, be careful not to inadvertently copy a file to the wrong directory. Doing this may cause problems that are hard to trace.

Other Development Systems

There are many other Java development systems besides Sun's SDK. Products are available from Symantec, Microsoft, Borland, and so on. Sun itself has a Java development system called Sun ONE Studio (it was formerly called Forte). These products are generally faster and more convenient to use than the SDK. They typically combine all functions--editing, compiling, and execution--in a single window.

For use with the example programs in this book, such development systems should be able to handle the version of Java used in this book or later. Many example programs (specifically, those that include user input) cannot be compiled with products designed for earlier versions of Java. (However, with minor modifications, some of which are mentioned in the "Overview" chapter, the .java files can be made to compile with older development systems.)
Further Reading

In this appendix: Data Structures and Algorithms; Object-Oriented Programming Languages; Object-Oriented Design (OOD) and Software Engineering.

In this appendix we'll mention some books on various aspects of software development, including data structures and algorithms. This is a subjective list; there are many other excellent titles on all the topics mentioned.

Data Structures and Algorithms

The definitive reference for any study of data structures and algorithms is The Art of Computer Programming by Donald Knuth of Stanford University (Addison Wesley). This seminal work, originally published in the late 1960s and early 1970s, is now in its third edition. It consists of three volumes (recently made available in a boxed set): Volume 1, Fundamental Algorithms; Volume 2, Seminumerical Algorithms; and Volume 3, Sorting and Searching. Of these, the last is the most relevant to the topics in this book. This work is highly mathematical and does not make for easy reading, but it is the bible for anyone contemplating serious research in the field.

A somewhat more accessible text is Robert Sedgewick's Algorithms in C++ (Addison Wesley). This book is adapted from the earlier Algorithms (Addison Wesley), in which the code examples were written in Pascal. It is comprehensive and authoritative. The text and code examples are quite compact and require close reading. Algorithms in Java by Robert Sedgewick and Michael Schidlowsky (Addison Wesley) covers the same ground in Java.

A good text for an undergraduate course in data structures and algorithms is Data Abstraction and Problem Solving with C++: Walls and Mirrors by Janet Prichard and Frank Carrano (Benjamin Cummings). There are many illustrations, and the chapters end with exercises and programming
projects. The Java version is Data Abstraction and Problem Solving with Java: Walls and Mirrors by Frank Carrano and Janet Prichard.

Practical Algorithms in C++ by Bryan Flamig (John Wiley and Sons) covers many of the usual topics in addition to some not frequently covered by other books, such as algorithm generators and string searching.

Programming Pearls by Jon Louis Bentley (Addison Wesley) was originally written in the 1980s but is nevertheless stuffed full of great advice for the programmer. Much of the material deals with data structures and algorithms.

Some other worthwhile texts on data structures and algorithms are Classic Data Structures in C++ by Timothy Budd (Addison Wesley), Data Structures and Problem Solving Using C++ by Mark Allen Weiss (Addison Wesley), and Data Structures Using C and C++ by Yedidyah Langsam et al. (Prentice Hall).

Object-Oriented Programming Languages

For an accessible and thorough introduction to Java and object-oriented programming, try Object-Oriented Programming in Java by Stephen Gilbert and Bill McCarty (Waite Group Press). If you're interested in C++, try Object-Oriented Programming in C++, Fourth Edition, by Robert Lafore (Sams Publishing).

The Java Programming Language, Third Edition, by Ken Arnold, James Gosling, and David Holmes (Addison Wesley) deals with Java syntax and is certainly authoritative (although briefer than many books); Gosling, who works at Sun Microsystems, is the creator of Java.

Core Java, Fifth Edition, by Cay Horstmann and Gary Cornell (Prentice Hall) is a multivolume series that covers in depth, but very accessibly, almost everything you want to know about programming in Java.

Object-Oriented Design (OOD) and Software Engineering

For an easy, non-academic introduction to software engineering, try The Object Primer: The Application Developer's Guide to Object-Orientation, Second Edition, by Scott Ambler (Cambridge University Press). This short book explains in plain language how to design a large software application. The title is a bit of a misnomer; it goes way beyond mere OO concepts.

Object-Oriented Design in Java by Stephen Gilbert and Bill McCarty (Waite Group Press) is an unusually accessible text.
A classic in the field of OOD is Object-Oriented Analysis and Design with Applications by Grady Booch (Addison Wesley). The author is one of the pioneers in this field and the creator of the Booch notation for depicting class relationships. This book isn't easy for beginners but is essential for more advanced readers.

An early book on OOD is The Mythical Man-Month by Frederick Brooks, Jr. (Addison Wesley, reprinted in 1995), which explains in a very clear and literate way some of the reasons why good software design is necessary. It is said to have sold more copies than any other computer book.

Other good texts on OOD are An Introduction to Object-Oriented Programming, Third Edition, by Timothy Budd (Addison Wesley); Object-Oriented Design Heuristics by Arthur Riel (Addison Wesley); and Design Patterns: Elements of Reusable Object-Oriented Software by Erich Gamma et al. (Addison Wesley).
answers to questions in this appendix overview arrays simple sorting stacks and queues overview answers to questions linked lists recursion insertsearch fordelete advanced sorting sorting binary trees red-black trees search key trees and external storage hash tables heaps method graphs dot weighted graphs data types arrays answers to questions true false new
appendix answers to questions interface raising to power false constant objects simple sorting answers to questions comparing and swapping (or copying false false false three items with indices less than or equal to outer are sorted copies items with indices less than outer are partially sorted
stacks and queues answers to questions last-in-first-outand first-in-first-out false it' the other way around it doesn' move at all false they take ( time ( true yesyou would need method to find the minimum value linked lists answers to questions first current next=null java' garbage collection process destroys it
appendix answers to questions empty linked list onceif the links include previous reference double-ended list usuallythe list they both do push(and pop(in ( timebut the list uses memory more efficiently recursion answers to questions false "ed divide-and-conquer the range of cells to search the number of disks to transfer stack
advanced sorting answers to questions false ( *logn) ( pivot true partitioning the resulting subarrays pivot log true binary trees answers to questions (logn true nodetree
appendix answers to questions finding aa' left-child descendents * + false compress red-black trees answers to questions in order (or inverse order false rotationschanging the colors of nodes red left childright child nodeits two children true true
trees and external storage answers to questions balanced false the root is split color flip (logn many true hash tables answers to questions ( hash function linear probing linked list
appendix answers to questions true the array size false false the same block heaps answers to questions both the right and left children have keys less than (or equal tothe parent root array (or linked list up one graphs answers to questions edgenodes (or vertices count the number of and divide by (assuming the identity diagonal is all
node :bb: --> -->dc:bd: tree no true directed acyclic graph noby definition some vertices remainbut none have no successors weighted graphs answers to questions edges false the lowest-weight (cheapestedge is already the destination of an edge with lower weight false true warshall' algorithm
appendix answers to questions adjacency matrix nwhere is the number of squares on the board minus no
symbols (dot operator) (assignment operator) =(equality operator) (modulo operator)hashing - trees implementation - node splits - trees - balance efficiency speed storage requirements experiments insertion java code dataitem class node class tree class tree app class node splits - nodes per organization red-black trees operational equivalence - transforming root splits - searching
trees tree workshop applet fill button find button partition - efficiency - stopping/swapping - ins button quicksort - zoom button - recursion tree java recursive (towers of hanoi) stable all-pairs shortest-path problemweighted graphs - anagram java abstract data type abstraction adts class interfaces access modifiers accessingarray elements addedge(method adjacency (graphs) adjacency list adjacency matrix directed graphs adjust_spath(method adts (abstract data types) abstraction data types design tool interfaces lists queue implementation - stack implementation algorithms big notation depth-first search (graphs) dijkstra' intractable problems - minimum spanning trees for weighted graphs anagrams - applets bubble sort workshop draw button new button run button size button step button graphd workshop - graphdw workshop - graphn workshop bfs dfs minimum spanning trees graphw workshop - hash workshop - hashchain workshop buckets deletion duplicates load factors table size hashdouble workshop - heap workshop change fill
insert start button remove text messages insertsort workshop selectsort workshop bars shellsort - bars stack workshop linklist workshop new deleting peek find pop inserting push sorted lists size mergesort - towers ordered workshop tree workshop - binary search tree workshop linear search fill button partition workshop - find button priorityq workshop ins button delete zoom button - implementation appletviewer utility insert argumentsjava - peek/new arithmetic expressionsparsing queue workshop empty/full evaluating postfix - - insert infix to postfix notation - - - new postfix notation peek arithmetic operatorsbinary trees remove array index out of bounds exception quicksort workshop - array java quicksort workshop array workshop applet - rbtree workshop arrays clicking on nodes accessing elements del button big notation find button binary search flip button constants ins button insertion in unordered array / button linear search rol button binary trees - ror button choosing how can we make this index more usefulemail us at indexes@samspublishing com
arrays creating example (array java) - deletion -trees (external storage) displaying choosing insertion efficiency - program organization insertion - searching one block per node hashing dictionary example searching balanced trees heap java avl holes choosing initializing bank java code listing - internalexternal storage - bankaccount class linear probing constructor logarithms - public/private keywords mergesort - bankapp class quadratic probes base classes shellsort mergesort spath bfs java stack workshop applet big notation storing objects binary search versus linked lists constants workshop applet - general-purpose storage structures deletion - insertion in unordered array duplicate keys - linear search insertion sorting algorithms searching special-purpose data structures see also hashing binary searches ascending-priority queue big notation assignmentjava findmethod - assignment operator (=) logarithms atend(method recursion autoexec batpath statement (sdk) avl trees algorithms loops - binary search trees
binary trees tree java analogy workshop applet - as arrays - binary treesnodes choosing binarysearch java duplicate keys black height efficiency heaps color flips blocks - huffman code - full java code hashing and external storage - node class insertion tree class sorting external files treeapp class booksresource - max/min values bottom-up insertion nodes braces {} deleting - brackets [] finding - brackets java inserting - bracketsapp class purpose breadth-first search - linked list searching bfs java ordered array insertion graphn applet terms java code child bubble sort keys efficiency - leaf invariants levels java code - parent workshop applet - path bubblesort(method root buckets subtree traversing visiting traversing -node +(java for +programmers) inorder change(methodheaps and java code charat(method preorder/postorder check(methodbracketsapp class workshop applet how can we make this index more usefulemail us at indexes@samspublishing com
child (binary treeschild (binary trees) children tree inserting trees searching red-black treesnull children splitting unbalanced trees chng key circular queues classes tree app treeapp vertex clustering bankaccount open addressing bankapp quadratic probes data types dataitem distpar dividing programs into - collisions hashing and - hashing efficiency color flips graph - -node instances -node interfaces - abstraction accomplishments color rules iterator colorsswitching higharray compareto(method linklist comparisons listiterator balanced trees lowarray lexicographical lowarrayapp mergesort ordarray compilersreferences node - compiling example programs heap java compressionhuffman code - params connected graphs person connectivitydirected graphs - postfixapp console mode reverser constantsbig notation stacktriangleapp constructorsbankaccount class stackx container classes methods tree copies (mergesort) cutoff point cycles (topological sorting) -
links delete(method data records data storage data structures find(method nodes red-black trees characteristics with no children general purpose with one child - arrays with two children - balanced trees priorityq workshop applet big notation separate chaining binary search trees hash tables vertices delimiter matching - - libraries brackets java - linked lists opening delimiters - speed - special purpose depth-first search - dfs java big notation graphn applet priority queues java code - queues stacks deques doubly linked lists data typesadts derived classes databases - descending-priority queue dataitem class destination vertices del button development systems rbtree workshop delete(method dfs(method dfs java higharray class dictionaryhashing - linear probing dijkstra' algorithm links key points deleting directed graphs array data - duplicate keys - example (array java) doubly linked lists linklist workshop applet connectivity in - topological sorting - cycles/trees graphd applet - disk drives -treesinsertion reading how can we make this index more usefulemail us at indexes@samspublishing com
display(method display(methodhigharray class displayforward(methoddoubly linked list displaylist(methoditerators edges displaynode(method adding displayperson(method minimum spanning trees displaystack(method minimum spanning trees for weighted graphs displayword(method distpar class divide-and-conquer approach doanagram(method doparse(method dotrans(method dot operator ) double-ended lists - double data typeinput double hashing hashdouble applet java code doubly linked lists - deletion - - deques insertion - traversal doublylinked java - duplicate keys binary trees red-black trees duplicates arrays deletion - insertion searching open addressing priority queuesminimum spanning trees for weighted graphs paths weighted graphs efficiencyweighted graphs equality operator (==) equations bubble sort insert sort logarithms - error handlingstacks eulerleonardgraphs exceptionsarray index out of bounds experimentstrees explicit pointers extended classes external hashing external storage -trees efficiency - insertion - one block per node searching choosing -trees hashing indexed files sequential virtual memory external data separate chaining one block at time versus no duplicates slow access
hashing finding full blocks linklist workshop applet non-full blocks links table of file pointers delete(method indexing find(method file in memory nodes multiple efficiency searching java code - too large workshop applet - search criteria firstlastlist java sequential ordering - flip buttonrbtree workshop sorting external files floating-pointsinput internal arrays - foldinghashing and internal sort for loopsbubblesort java merging functions hash functions objects procedural languages factorial(method anagramming factorials - garbage collection fields linked lists fill button general-purpose data storage structuresbig notation general-purpose data structures fill key find button find buttonrbtree workshop find(method - higharray class linear probing arrays balanced trees binary search trees hash tables libraries speed - links recursion tree class tree class getadjunvisitedvertex(method getelem(methodlowarrayapp class getint(method getiterator(method how can we make this index more usefulemail us at indexes@samspublishing com
getmin(method getmin(method traces getsuccessor(method trees graph class - vertices graphdw workshop applet - adding shortest-path array deleting graphn workshop applet weighted dfs minimum spanning trees graphs - see also weighted graphs graphw workshop applet - guess- -number game adjacency choosing connected critical path analysis directed connectivity and - hamiltonian cycles hash functions edgesadding computation graph class - folding lines hashing strings maze analogy non-random keys - minimum spanning trees graphn applet java code - mst java random keys hash tables choosing hash javalinear probing - pins hashchain java representing in apps hashdouble workshop applet adjacency list hashfunc(method adjacency matrix hashfunc (method edges hashfunc (method vertices hashfunc (method searching hashing - breadth-first search - choosing depth-first search - collisions - sparse dictionary - topological sorting - efficiency cycles/trees open addressing - graphd applet - open addressing versus separate chaining java code -
separate chaining insert employee numbers (keys) external remove heapsort external storage - efficiency open addressing recursion double hashing - heapsort java linear probing - - higharray class quadratic probing higharray java separate chaining hoarec hashchain applet - holes java code horner' method heap java huffman code - arrayexpanding array size insertion - key change listing implementation removal - - trees - heapify(method priorityq workshop applet heaps queues - expansion red-black trees insertion - increments java code incrementsize(method array expansion indexed fileschoosing array size indexes insertion - arrays key change hash tables removal - indexing (external storage) priority queues file in memory removal multiple swapped - searching tree-based - too large weakly ordered induction workshop applet infix notation change binary trees fill translating to postfix - - how can we make this index more usefulemail us at indexes@samspublishing com
infix notation java code - nodes operatorssaving on stack heaps - translation rules - java code infix java inheritance red-black trees - workshop applet initialization lists priorityq workshop applet initializing arrays queue workshop applet inorder traversal (binary trees) separate chaining inorder(method sequential ordering input tree class character insertion sort - floating-point efficiency integers insertsort workshop applet strings - invariants ins button rbtree workshop java code partial ins key partitionssmall insert(method shellsort binary searches insertsort program binary trees inside grandchild heaps rotations higharray class instantiating linear probing int typedata types priority queues integersinput queues interfaces sortedlist java tree class insertafter(method doubly linked lists inserting trees array data duplicate keys example (array java) adts classes - interiterator java internal arraysexternal storage - interval sequence shellsort intractable problemsalgorithms - invariants bubble sort -trees - insertion sort doubly linked lists selection sort linklist workshop applet
iterators infixconverting to postfix - atend(method input classesmethods character interiterator java floating-point iterator class integers methods strings - operations library data structures references methodsmain() minimum spanning trees - minimum spanning trees for weighted graphs - new operator nodes java code trees dataitem class node class tree class tree java tree app class binary trees node class nodes with one child traversing tree class treeapp class brackets java breadth-first search for +programmers depth-first search - double hashing heap java array expansion array size insertion - key change removal - finding - inserting - output overloaded operators pointers arguments assignment equality/identity new operator references - postfix expressionsevaluating primitive variable types priority queue - queue separate chaining shellsort - shortest-path problem getmin(method path(method - spath array/distpar class sorted lists - stacks topological sorting - java interpreter how can we make this index more usefulemail us at indexes@samspublishing com
key values hash java - insert(method key valuessorted lists keys binary trees double hashing linear searches big notation lines graphs hash functionsrandom keys link classdoubly linked lists hashing linked lists heap java keywords adjacency lists (graphs) binary trees private choosing public double-ended - knapsack problemrecursion - knuthinterval sequence doubly - deletion - - deques insertion - traversal efficiency last-in-first-out (lifo) iterators leaf (binary trees) atend(method leafs interiterator java trees iterator class recordsb-tree methods levels (binary trees) operations lexicographical comparisons references librariesgeneral-purpose data structures linklist class library data structures linklist workshop applet linear probing deleting clustering find duplicates inserting hash workshop applet - linklist java hashing efficiency links java code basic types - array expansion finding/deleting delete(method delete(method find(method find(method
references - towers java separate chaining tree java versus arrays tree java listings triangle java anagram java listinsertionsort java array java listiterator class binarysearch java lists brackets java adts doublylinked java sorted firstlastlist java efficiency hashchain java insertion sort - heap java java code - heapsort java linklist workshop applet higharray java sortedlist java - infix java load factors interiterator java logarithms linklist java equation - linklist java inverse equation listinsertionsort java lowarray java shellsort loops merge java bfs(method (graph class) mergesort java recursion - orderedarray java triangular number nth term partition java lowarray class path java lowarray java postfix java lowarrayapp class priorityq java queue java quicksort java quicksort java quicksort java main(method reverse java anagram java shellsort java array java sortedlist java bracketsapp class stacktriangle java classdataarray java stacktriangle java infixapp class how can we make this index more usefulemail us at indexes@samspublishing com
main(method linklist java bubblesort() linkstack class change() lowarray class charat() lowarrayapp class compareto() postfixapp class delete(stack java higharray class manualsort(method linear probing mathematical induction (recursion) links maze analogy deques medianquicksort pivot value dfs() median-of-three partitioning display()higharray class medianof (method displayforward() memory displaylistiterators blocks - displaynode() index files displayperson() swapsselection sort displaystack() merge(method displayword() merge java doanagram() mergesort doparse() comparisons dotrans() copies factorial() efficiency - find(external files binary searches mergesort java - higharray class sorted arrays - linear probing sorting by merging - links workshop applet - recursion mergesort(method tree class mergesort java - getadjunvisitedvertex() mergingexternal storage getelem()lowarrayapp class methods getint() accessing getiterator() addedge() getmin() adjust_spath() getsuccessor() atend() hashfunc() bankaccount class hashfunc ()
hashfunc () stackx class hashfunc () pop()stackx class heapify() preorder() incrementsize() priorityq class inorder() priorityqueue class insert(push()stackx class heaps putinpq() higharray class quicksort() linear probing readstring() priority queues recfind() queues recmergesort() insertafter() recquicksort() doubly linked lists iterator iterator class main() recursion simulating - remove(priority queues infixapp class queues linklist java rotate() linkstack selectsort() lowarrayapp class - shellsort() postfixapp class size() stack java split) manualsort() stackx class medianof () sumallcolumns() merge() sumremainingcolumns() mergesort swap() mst() topo() mstw() triangle() nosuccessors() trickledown() parseint() trickleup() partitionin()pivot values minimum spanning trees partitionit() graphn applet path() - java code - peek(mst java queue workshop applet queues weighted graphs algorithm graphw applet - how can we make this index more usefulemail us at indexes@samspublishing com
minimum spanning trees java code - mstw java finding efficiency modulo operator (%)hashing java code - moore' law workshop applet - mst(method mst java mstw(method mstw java heaps insertion - removal inserting java code - workshop applet key values -sorting - levels leaf navigating trees number required new button path new operator parent arrays red-black trees java clicking node verticesadding to graphs deleting node class heap java node splits - trees - nodes transformation to red-black trees inserting nodes per inserting - weird crossover root color flip splits -node -node -trees splitting splitting trees) viewing subtrees binary trees swappingheaps - child two childrendeleting deleting visiting with no children see also red-black trees with one child - nosuccessors(method with two children -
quadratic probing hashdouble applet objects step accessing methods operators arrays assignment (=) classes dot ) comp to nodes equality (==) creating new sorting arrays java code overloaded lexicographical comparisons stability storing saving on stack ordarray class ordered arrays classdataarray java - advantages person class binary search oop (object-oriented programming) binary trees bank java - find(method inheritance linear search objects - polymorphism ordarray class ordered workshop applet procedural languages - binary searchguess- -number game open addressing double hashing linear search orderedarray java - hashdouble applet output java code outside grandchildrotations hashing efficiency - overloading operators linear probe java code array expansion delete(method find(method hash java - package access insert(method params class linear probing parent (binary trees) clustering parent vertices duplicates parentheses () hash workshop applet - how can we make this index more usefulemail us at indexes@samspublishing com
parsing arithmetic expressions parsing arithmetic expressions evaluating postfix - - popping parsing arithemetic expressions infix to postfix notation - - postal analogy (stacks) - stack workshop applet postfix notation postfix notation parseint(method evaluating - - partition java infixtranslating to postfix - - partitioning java code - partition algorithm - operatorssaving on stack partition java - translation rules - small partitions postfix java workshop applet postfixapp class partitionit(method powers of two pivot values path(method preemptive multitasking operating systempriority queues path java prefix notation paths (binary trees) preorder(method peek(method previous fielditerators queue workshop applet prime numbersdouble hashing queues primitive types stackx class peeking adts arrays priorityq workshop applet primitive variable types queue workshop applet priority queues - stack workshop applet choosing person class efficiency pins heaps pivot valuesquicksort - - java priority queue pointersjava minimum spanning trees for weighted graphs arguments assignment priorityq workshop applet - equality/identity priorityq classmstw java new operator priorityq workshop applet references - delete polymorphism implementation pop(methodstackx class insert peek/new
priorityq java priority queues priorityqueue classmethods workshop applet - private keywordbankaccount class quicksort procedural languages algorithm - modeling efficiency - organizational units (nsquaredperformance - problems partitioning programs partition algorithm - data storage structures partition java - dividing into classes workshop applet - public keywordbankaccount class partitionssmall push(methodstackx class pivot value - - pushing quicksort workshop applet - putinpq(method quicksort workshop applet quicksort java recursionremoving three partitioningmedian - quicksort(method quadratic probing quicksort java hashdouble applet quicksort java step quicksort java queue java queues - breadth-first search choosing circular radix sort deques algorithm efficiency efficiency examples program design implementingadts - radix-exchange sort java code railroad line - implementation - range conversion insert(method rangespowers of two peek(method / buttonrbtree workshop remove(method rbtree workshop applet - readstring(method how can we make this index more usefulemail us at indexes@samspublishing com
real-world data real-world data triangular numbers recfind(method efficiency recmergesort(method mathematical induction records methods -trees nth term with loop hashing nth term with recursion - memory blocks searching recquicksort(method rectriangle(methodstacktriangle java recursion triangle java - red-black trees trees operational equivalence transforming anagrams - balance applications for - colored nodes binary search duplicate keys algorithms efficiency loops - experiment combinations - experiment eliminating experiment simulating method - - experiment stacks implementing factorials heapsort inorder traversal (binary trees) nodes deleting inserting - knapsack problem - null children mergesort red-black rules efficiency - rotations - mergesort java - subtrees sorted arrays weird crossover node sorting by merging - rules and balanced trees workshop applet top-down insertion removing from quicksort towers of hanoi references algorithmsspeed of recursive algorithm java - subtrees linked lists towers workshop applet towers java links - rem button
removalheap java - binaryguess- -number game remove(method external storagesearch criteria priority queues queues removingqueue workshop applet graphs breadth-first search - depth-first search - resources - indexing reverse java linear reverser class separate chaining rol buttonrbtree workshop sequential ordering - root tree class color flip splitting trees) - selection sort - efficiency root (binary trees) invariant ror buttonrbtree workshop java code rotate(method selectsort workshop applet rotating selectsort(method rotations separate chaining inside grandchild hashchain applet outside grandchild buckets red-black trees - deletion experiment duplicates subtrees load factors weird crossover node rulesred-black runtime errorsarray null objects table size hashing efficiency java code sequential ordering block insertion external storage - searching - sdk shelldonald searching shellsort - trees diminishing gaps array data efficiency example (array java) interval sequences arraysduplicate keys java code - -trees -sorting - workshop applet - how can we make this index more usefulemail us at indexes@samspublishing com
preface xv programminga general overview what' this book about mathematics review exponents logarithms series modular arithmetic the word brief introduction to recursion + classes basic class syntax extra constructor syntax and accessors separation of interface and implementation vector and string +details pointers lvaluesrvaluesand references parameter passing return passing std::swap and std::move the big-fivedestructorcopy constructormove constructorcopy assignment operator=move assignment operator -style arrays and strings templates function templates class templates objectcomparableand an example function objects separate compilation of class templates using matrices the data membersconstructorand basic accessors operator[ vii
contents big-five summary exercises references algorithm analysis mathematical background model what to analyze running-time calculations simple example general rules solutions for the maximum subsequence sum problem logarithms in the running time limitations of worst-case analysis summary exercises references listsstacksand queues abstract data types (adts the list adt simple array implementation of lists simple linked lists vector and list in the stl iterators exampleusing erase on list const_iterators implementation of vector implementation of list the stack adt stack model implementation of stacks applications the queue adt queue model array implementation of queues applications of queues summary exercises
trees preliminaries implementation of trees tree traversals with an application binary trees implementation an exampleexpression trees the search tree adt--binary search trees contains findmin and findmax insert remove destructor and copy constructor average-case analysis avl trees single rotation double rotation splay trees simple idea (that does not work splaying tree traversals (revisited -trees sets and maps in the standard library sets maps implementation of set and map an example that uses several maps summary exercises references hashing general idea hash function separate chaining hash tables without linked lists linear probing quadratic probing double hashing rehashing hash tables in the standard library ix
contents hash tables with worst-case ( access perfect hashing cuckoo hashing hopscotch hashing universal hashing extendible hashing summary exercises references priority queues (heaps model simple implementations binary heap structure property heap-order property basic heap operations other heap operations applications of priority queues the selection problem event simulation -heaps leftist heaps leftist heap property leftist heap operations skew heaps binomial queues binomial queue structure binomial queue operations implementation of binomial queues priority queues in the standard library summary exercises references sorting preliminaries insertion sort the algorithm stl implementation of insertion sort analysis of insertion sort lower bound for simple sorting algorithms
shellsort worst-case analysis of shellsort heapsort analysis of heapsort mergesort analysis of mergesort quicksort picking the pivot partitioning strategy small arrays actual quicksort routines analysis of quicksort linear-expected-time algorithm for selection general lower bound for sorting decision trees decision-tree lower bounds for selection problems adversary lower bounds linear-time sortsbucket sort and radix sort external sorting why we need new algorithms model for external sorting the simple algorithm multiway merge polyphase merge replacement selection summary exercises references the disjoint sets class equivalence relations the dynamic equivalence problem basic data structure smart union algorithms path compression worst case for union-by-rank and path compression slowly growing functions an analysis by recursive decomposition an om log bound an om (mnbound an application xi
contents summary exercises references graph algorithms definitions representation of graphs topological sort shortest-path algorithms unweighted shortest paths dijkstra' algorithm graphs with negative edge costs acyclic graphs all-pairs shortest path shortest path example network flow problems simple maximum-flow algorithm minimum spanning tree prim' algorithm kruskal' algorithm applications of depth-first search undirected graphs biconnectivity euler circuits directed graphs finding strong components introduction to np-completeness easy vs hard the class np np-complete problems summary exercises references algorithm design techniques greedy algorithms simple scheduling problem huffman codes approximate bin packing divide and conquer running time of divide-and-conquer algorithms closest-points problem
the selection problem theoretical improvements for arithmetic problems dynamic programming using table instead of recursion ordering matrix multiplications optimal binary search tree all-pairs shortest path randomized algorithms random-number generators skip lists primality testing backtracking algorithms the turnpike reconstruction problem games summary exercises references amortized analysis an unrelated puzzle binomial queues skew heaps fibonacci heaps cutting nodes in leftist heaps lazy merging for binomial queues the fibonacci heap operations proof of the time bound splay trees summary exercises references advanced data structures and implementation top-down splay trees red-black trees bottom-up insertion top-down red-black trees top-down deletion treaps xiii
contents suffix arrays and suffix trees suffix arrays suffix trees linear-time construction of suffix arrays and suffix trees - trees pairing heaps summary exercises references appendix separate compilation of class templates everything in the header explicit instantiation index
purpose/goals the fourth edition of data structures and algorithm analysis in +describes data structuresmethods of organizing large amounts of dataand algorithm analysisthe estimation of the running time of algorithms as computers become faster and fasterthe need for programs that can handle large amounts of input becomes more acute paradoxicallythis requires more careful attention to efficiencysince inefficiencies in programs become most obvious when input sizes are large by analyzing an algorithm before it is actually codedstudents can decide if particular solution will be feasible for examplein this text students look at specific problems and see how careful implementations can reduce the time constraint for large amounts of data from centuries to less than second thereforeno algorithm or data structure is presented without an explanation of its running time in some casesminute details that affect the running time of the implementation are explored once solution method is determineda program must still be written as computers have become more powerfulthe problems they must solve have become larger and more complexrequiring development of more intricate programs the goal of this text is to teach students good programming and algorithm analysis skills simultaneously so that they can develop such programs with the maximum amount of efficiency this book is suitable for either an advanced data structures course or first-year graduate course in algorithm analysis students should have some knowledge of intermediate programmingincluding such topics as pointersrecursionand object-based programmingas well as some background in discrete math approach although the material in this text is largely language-independentprogramming requires the use of specific language as the title implieswe have chosen +for this book +has become leading systems programming language in addition to fixing many of the syntactic flaws of cc+provides direct constructs (the class and templateto implement generic data structures as abstract data types the most difficult part of writing this book was deciding on the amount of +to include use too many features of +and one gets an incomprehensible textuse too few and you have little more than text that supports classes the approach we take is to present the material in an object-based approach as suchthere is almost no use of inheritance in the text we use class templates to describe generic data structures we generally avoid esoteric +features and use the vector and string classes that are now part of the +standard previous editions have implemented class templates by separating the class template interface from its implementation although this is arguably the preferred approachit exposes compiler problems that have made it xv
preface difficult for readers to actually use the code as resultin this edition the online code represents class templates as single unitwith no separation of interface and implementation provides review of the +features that are used throughout the text and describes our approach to class templates appendix describes how the class templates could be rewritten to use separate compilation complete versions of the data structuresin both +and javaare available on the internet we use similar coding conventions to make the parallels between the two languages more evident summary of the most significant changes in the fourth edition the fourth edition incorporates numerous bug fixesand many parts of the book have undergone revision to increase the clarity of presentation in additionr includes implementation of the avl tree deletion algorithm-- topic often requested by readers has been extensively revised and enlarged and now contains material on two newer algorithmscuckoo hashing and hopscotch hashing additionallya new section on universal hashing has been added also new is brief discussion of the unordered_set and unordered_map class templates introduced in ++ is mostly unchangedhoweverthe implementation of the binary heap makes use of move operations that were introduced in ++ now contains material on radix sortand new section on lower-bound proofs has been added sorting code makes use of move operations that were introduced in ++ uses the new union/find analysis by seidel and sharir and shows the om (mnbound instead of the weaker om logn bound in prior editions adds material on suffix trees and suffix arraysincluding the linear-time suffix array construction algorithm by karkkainen and sanders (with implementationthe sections covering deterministic skip lists and aa-trees have been removed throughout the textthe code has been updated to use ++ notablythis means use of the new ++ featuresincluding the auto keywordthe range for loopmove construction and assignmentand uniform initialization overview contains review material on discrete math and recursion believe the only way to be comfortable with recursion is to see good uses over and over thereforerecursion is prevalent in this textwith examples in every except also includes material that serves as review of basic +included is discussion of templates and important constructs in +class design deals with algorithm analysis this explains asymptotic analysis and its major weaknesses many examples are providedincluding an in-depth explanation of logarithmic running time simple recursive programs are analyzed by intuitively converting them into iterative programs more complicated divide-and-conquer programs are introducedbut some of the analysis (solving recurrence relationsis implicitly delayed until where it is performed in detail
covers listsstacksand queues this includes discussion of the stl vector and list classesincluding material on iteratorsand it provides implementations of significant subset of the stl vector and list classes covers treeswith an emphasis on search treesincluding external search trees ( -treesthe unix file system and expression trees are used as examples avl trees and splay trees are introduced more careful treatment of search tree implementation details is found in additional coverage of treessuch as file compression and game treesis deferred until data structures for an external medium are considered as the final topic in several included is discussion of the stl set and map classesincluding significant example that illustrates the use of three separate maps to efficiently solve problem discusses hash tablesincluding the classic algorithms such as separate chaining and linear and quadratic probingas well as several newer algorithmsnamely cuckoo hashing and hopscotch hashing universal hashing is also discussedand extendible hashing is covered at the end of the is about priority queues binary heaps are coveredand there is additional material on some of the theoretically interesting implementations of priority queues the fibonacci heap is discussed in and the pairing heap is discussed in covers sorting it is very specific with respect to coding details and analysis all the important general-purpose sorting algorithms are covered and compared four algorithms are analyzed in detailinsertion sortshellsortheapsortand quicksort new to this edition is radix sort and lower bound proofs for selection-related problems external sorting is covered at the end of the discusses the disjoint set algorithm with proof of the running time this is short and specific that can be skipped if kruskal' algorithm is not discussed covers graph algorithms algorithms on graphs are interestingnot only because they frequently occur in practice but also because their running time is so heavily dependent on the proper use of data structures virtually all of the standard algorithms are presented along with appropriate data structurespseudocodeand analysis of running time to place these problems in proper contexta short discussion on complexity theory (including np-completeness and undecidabilityis provided covers algorithm design by examining common problem-solving techniques this is heavily fortified with examples pseudocode is used in these later so that the student' appreciation of an example algorithm is not obscured by implementation details deals with amortized analysis three data structures from and and the fibonacci heapintroduced in this are analyzed covers search tree algorithmsthe suffix tree and arraythe - treeand the pairing heap this departs from the rest of the text by providing complete and careful implementations for the search trees and pairing heap the material is structured so that the instructor can integrate sections into discussions from other for examplethe top-down red-black tree in can be discussed along with avl trees (in to provide enough material for most one-semester data structures courses if time permitsthen can be covered graduate course on algorithm analysis could cover to the advanced data structures analyzed in can easily be referred to in the earlier the discussion of np-completeness in xvii
is far too brief to be used in such a course. You might find it useful to use an additional work on NP-completeness to augment this text.

Exercises

Exercises, provided at the end of each chapter, match the order in which material is presented. The last exercises may address the chapter as a whole rather than a specific section. Difficult exercises are marked with an asterisk, and more challenging exercises have two asterisks.

References

References are placed at the end of each chapter. Generally the references either are historical, representing the original source of the material, or they represent extensions and improvements to the results given in the text. Some references represent solutions to exercises.

Supplements

The following supplements are available to all readers:

Source code for example programs
Errata

In addition, the following material is available only to qualified instructors at Pearson's Instructor Resource Center (www.pearsonhighered.com/irc). Visit the IRC or contact your Pearson Education sales representative for access.

Solutions to selected exercises
Figures from the book
Errata

Acknowledgments

Many, many people have helped me in the preparation of books in this series. Some are listed in other versions of the book; thanks to all.

As usual, the writing process was made easier by the professionals at Pearson. I'd like to thank my editor, Tracy Johnson, and production editor, Marilyn Lloyd. My wonderful wife Jill deserves extra special thanks for everything she does.

Finally, I'd like to thank the numerous readers who have sent e-mail messages and pointed out errors or inconsistencies in earlier versions. My website, www.cis.fiu.edu/~weiss, will also contain updated source code (in C++ and Java), an errata list, and a link to submit bug reports.

Miami, Florida
Programming: A General Overview

In this chapter, we discuss the aims and goals of this text and briefly review programming concepts and discrete mathematics. We will:

See that how a program performs for reasonably large input is just as important as its performance on moderate amounts of input.
Summarize the basic mathematical background needed for the rest of the book.
Briefly review recursion.
Summarize some important features of C++ that are used throughout the text.

What's This Book About?

Suppose you have a group of N numbers and would like to determine the kth largest. This is known as the selection problem. Most students who have had a programming course or two would have no difficulty writing a program to solve this problem. There are quite a few "obvious" solutions.

One way to solve this problem would be to read the N numbers into an array, sort the array in decreasing order by some simple algorithm such as bubble sort, and then return the element in position k.

A somewhat better algorithm might be to read the first k elements into an array and sort them (in decreasing order). Next, each remaining element is read one by one. As a new element arrives, it is ignored if it is smaller than the kth element in the array. Otherwise, it is placed in its correct spot in the array, bumping one element out of the array. When the algorithm ends, the element in the kth position is returned as the answer.

Both algorithms are simple to code, and you are encouraged to do so. The natural questions, then, are: Which algorithm is better? And, more important, is either algorithm good enough? A simulation using a random file of 30 million elements and k = 15,000,000 will show that neither algorithm finishes in a reasonable amount of time; each requires several days of computer processing to terminate (albeit eventually with a correct answer). An alternative method, discussed in Chapter 7, gives a solution in about a second. Thus, although our proposed algorithms work, they cannot be considered good algorithms, because they are entirely impractical for input sizes that a third algorithm can handle in a reasonable amount of time.
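The second algorithm described above can be sketched in a few lines of C++. This is an illustrative version of my own, not the book's code; the function name kthLargest and the use of std::vector are assumptions made only to keep the sketch self-contained:

```cpp
#include <algorithm>
#include <functional>
#include <vector>

// Returns the kth largest element (k is 1-based) of input, assuming 1 <= k <= input.size().
// Mirrors the idea of keeping the k largest elements seen so far, in decreasing order.
int kthLargest( const std::vector<int> & input, int k )
{
    // Read the first k elements and sort them in decreasing order.
    std::vector<int> best( input.begin( ), input.begin( ) + k );
    std::sort( best.begin( ), best.end( ), std::greater<int>{ } );

    // Each remaining element is ignored or placed in its correct spot,
    // bumping the current kth element out of the array.
    for( int i = k; i < static_cast<int>( input.size( ) ); ++i )
    {
        if( input[ i ] > best[ k - 1 ] )
        {
            int pos = k - 1;
            while( pos > 0 && best[ pos - 1 ] < input[ i ] )
            {
                best[ pos ] = best[ pos - 1 ];
                --pos;
            }
            best[ pos ] = input[ i ];
        }
    }
    return best[ k - 1 ];
}
```

As the discussion above notes, even this more careful version still takes far too long on very large inputs, which is exactly why a better method is needed.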
Figure 1.1 Sample word puzzle

A second problem is to solve a popular word puzzle. The input consists of a two-dimensional array of letters and a list of words. The object is to find the words in the puzzle. These words may be horizontal, vertical, or diagonal in any direction. As an example, the puzzle shown in Figure 1.1 contains the words this, two, fat, and that. The word this begins at row 1, column 1, or (1,1), and extends to (1,4); two goes from (1,1) to (3,1); fat goes from (4,1) to (2,3); and that goes from (4,4) to (1,1).

Again, there are at least two straightforward algorithms that solve the problem. For each word in the word list, we check each ordered triple (row, column, orientation) for the presence of the word. This amounts to lots of nested for loops but is basically straightforward.

Alternatively, for each ordered quadruple (row, column, orientation, number of characters) that doesn't run off an end of the puzzle, we can test whether the word indicated is in the word list. Again, this amounts to lots of nested for loops. It is possible to save some time if the maximum number of characters in any word is known.

It is relatively easy to code up either method of solution and solve many of the real-life puzzles commonly published in magazines. These typically have 16 rows, 16 columns, and 40 or so words. Suppose, however, we consider the variation where only the puzzle board is given and the word list is essentially an English dictionary. Both of the solutions proposed require considerable time to solve this problem and therefore might not be acceptable. However, it is possible, even with a large word list, to solve the problem very quickly.

An important concept is that, in many problems, writing a working program is not good enough. If the program is to be run on a large data set, then the running time becomes an issue. Throughout this book we will see how to estimate the running time of a program for large inputs and, more important, how to compare the running times of two programs without actually coding them. We will see techniques for drastically improving the speed of a program and for determining program bottlenecks. These techniques will enable us to find the section of the code on which to concentrate our optimization efforts.
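Returning to the word puzzle, the first brute-force method (try every word at every row, column, and orientation) might look like the following sketch. The helper names, the eight-direction table, and the vector-of-strings board representation are my own assumptions, not the book's code:

```cpp
#include <string>
#include <vector>

// The eight orientations: horizontal, vertical, and the four diagonals, each in both directions.
static const int DR[ ] = { 0, 0, 1, -1, 1, 1, -1, -1 };
static const int DC[ ] = { 1, -1, 0, 0, 1, -1, 1, -1 };

// Returns true if word appears in the board starting at (row, col) and going in direction d.
bool matchesAt( const std::vector<std::string> & board,
                const std::string & word, int row, int col, int d )
{
    int rows = static_cast<int>( board.size( ) );
    int cols = static_cast<int>( board[ 0 ].size( ) );
    for( int i = 0; i < static_cast<int>( word.size( ) ); ++i )
    {
        int r = row + DR[ d ] * i;
        int c = col + DC[ d ] * i;
        if( r < 0 || r >= rows || c < 0 || c >= cols || board[ r ][ c ] != word[ i ] )
            return false;
    }
    return true;
}

// The straightforward algorithm: for one word, try every (row, column, orientation) triple.
bool wordInPuzzle( const std::vector<std::string> & board, const std::string & word )
{
    for( int r = 0; r < static_cast<int>( board.size( ) ); ++r )
        for( int c = 0; c < static_cast<int>( board[ 0 ].size( ) ); ++c )
            for( int d = 0; d < 8; ++d )
                if( matchesAt( board, word, r, c, d ) )
                    return true;
    return false;
}
```

Calling wordInPuzzle once per word in the list gives exactly the nested-loop structure described above, which is why the running time explodes when the word list is an entire dictionary.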
Mathematics Review

This section lists some of the basic formulas you need to memorize, or be able to derive, and reviews basic proof techniques.

Exponents

X^A X^B = X^{A+B}
X^A / X^B = X^{A-B}
(X^A)^B = X^{AB}
X^N + X^N = 2X^N \ne X^{2N}
2^N + 2^N = 2^{N+1}

Logarithms

In computer science, all logarithms are to the base 2 unless specified otherwise.

Definition 1.1
X^A = B if and only if \log_X B = A.

Several convenient equalities follow from this definition.

Theorem 1.1
\log_A B = \frac{\log_C B}{\log_C A};  A, B, C > 0, A \ne 1

Proof
Let X = \log_C B, Y = \log_C A, and Z = \log_A B. Then, by the definition of logarithms, C^X = B, C^Y = A, and A^Z = B. Combining these three equalities yields C^X = B = A^Z = (C^Y)^Z. Therefore, X = YZ, which implies Z = X/Y, proving the theorem.

Theorem 1.2
\log AB = \log A + \log B;  A, B > 0

Proof
Let X = \log A, Y = \log B, and Z = \log AB. Then, assuming the default base of 2, 2^X = A, 2^Y = B, and 2^Z = AB. Combining the last three equalities yields 2^X 2^Y = AB = 2^Z. Therefore, X + Y = Z, which proves the theorem.

Some other useful formulas, which can all be derived in a similar manner, follow.

\log(A/B) = \log A - \log B
\log(A^B) = B \log A
\log X < X for all X > 0
\log 1 = 0,  \log 2 = 1,  \log 1,024 = 10,  \log 1,048,576 = 20
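As a quick numeric sanity check of the two theorems (an example of my own, not from the text):

\log_2 32 = \frac{\log_{10} 32}{\log_{10} 2} \approx \frac{1.50515}{0.30103} = 5, \qquad \text{and indeed } 2^5 = 32

\log(8 \cdot 16) = \log 8 + \log 16 = 3 + 4 = 7, \qquad \text{and indeed } 2^7 = 128 = 8 \cdot 16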
Series

The easiest formulas to remember are

\sum_{i=0}^{N} 2^i = 2^{N+1} - 1

and the companion,

\sum_{i=0}^{N} A^i = \frac{A^{N+1} - 1}{A - 1}

In the latter formula, if 0 < A < 1, then

\sum_{i=0}^{N} A^i \le \frac{1}{1 - A}

and as N tends to infinity, the sum approaches 1/(1 - A). These are the "geometric series" formulas.

We can derive the last formula for \sum_{i=0}^{\infty} A^i (0 < A < 1) in the following manner. Let S be the sum. Then

S = 1 + A + A^2 + A^3 + A^4 + A^5 + \cdots

Then

AS = A + A^2 + A^3 + A^4 + A^5 + \cdots

If we subtract these two equations (which is permissible only for convergent series), virtually all the terms on the right side cancel, leaving

S - AS = 1

which implies that

S = \frac{1}{1 - A}

We can use this same technique to compute \sum_{i=1}^{\infty} i/2^i, a sum that occurs frequently. We write

S = \frac{1}{2} + \frac{2}{4} + \frac{3}{8} + \frac{4}{16} + \frac{5}{32} + \cdots

and multiply by 2, obtaining

2S = 1 + \frac{2}{2} + \frac{3}{4} + \frac{4}{8} + \frac{5}{16} + \cdots

Subtracting these two equations yields

S = 1 + \frac{1}{2} + \frac{1}{4} + \frac{1}{8} + \frac{1}{16} + \cdots

Thus, S = 2.
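A tiny self-contained check of my own (not from the book) confirms the derivation numerically: the partial sums of \sum_{i \ge 1} i/2^i converge rapidly to 2.

```cpp
#include <cstdio>

int main( )
{
    // Accumulate the first 50 terms of sum_{i>=1} i / 2^i; term holds 1/2^i.
    double s = 0.0, term = 0.5;
    for( int i = 1; i <= 50; ++i, term /= 2 )
        s += i * term;
    std::printf( "sum_{i=1}^{50} i/2^i = %.12f\n", s );   // prints a value extremely close to 2
    return 0;
}
```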
Another type of common series in analysis is the arithmetic series. Any such series can be evaluated from the basic formula

\sum_{i=1}^{N} i = \frac{N(N+1)}{2} \approx \frac{N^2}{2}

For instance, to find the sum 2 + 5 + 8 + \cdots + (3k - 1), rewrite it as 3(1 + 2 + 3 + \cdots + k) - (1 + 1 + 1 + \cdots + 1), which is clearly 3k(k+1)/2 - k. Another way to remember this is to add the first and last terms (total 3k + 1), the second and next-to-last terms (total 3k + 1), and so on. Since there are k/2 of these pairs, the total sum is k(3k + 1)/2, which is the same answer as before.

The next two formulas pop up now and then but are fairly uncommon:

\sum_{i=1}^{N} i^2 = \frac{N(N+1)(2N+1)}{6} \approx \frac{N^3}{3}

\sum_{i=1}^{N} i^k \approx \frac{N^{k+1}}{|k+1|},  k \ne -1

When k = -1, the latter formula is not valid. We then need the following formula, which is used far more in computer science than in other mathematical disciplines. The numbers H_N are known as the harmonic numbers, and the sum is known as a harmonic sum. The error in the following approximation tends to \gamma \approx 0.57721566, which is known as Euler's constant.

H_N = \sum_{i=1}^{N} \frac{1}{i} \approx \log_e N

These two formulas are just general algebraic manipulations:

\sum_{i=1}^{N} f(N) = N f(N)

\sum_{i=n_0}^{N} f(i) = \sum_{i=1}^{N} f(i) - \sum_{i=1}^{n_0 - 1} f(i)
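The harmonic-number approximation above is easy to verify empirically. This small check is my own sketch (the variable names and the choice of N = 1,000,000 are arbitrary), not code from the text:

```cpp
#include <cmath>
#include <cstdio>

int main( )
{
    const double eulerGamma = 0.57721566;   // Euler's constant, as quoted above
    const int N = 1000000;

    // Compute the harmonic number H_N directly.
    double h = 0.0;
    for( int i = 1; i <= N; ++i )
        h += 1.0 / i;

    // H_N and log_e N + gamma should agree to several decimal places for large N.
    std::printf( "H_N = %.8f,  ln N + gamma = %.8f\n", h, std::log( static_cast<double>( N ) ) + eulerGamma );
    return 0;
}
```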
Modular Arithmetic

We say that A is congruent to B modulo N, written A \equiv B (mod N), if N divides A - B. Intuitively, this means that the remainder is the same when either A or B is divided by N. Thus, 81 \equiv 61 \equiv 1 (mod 10). As with equality, if A \equiv B (mod N), then A + C \equiv B + C (mod N) and AD \equiv BD (mod N).

Often, N is a prime number. In that case, there are three important theorems:

First, if N is prime, then AB \equiv 0 (mod N) is true if and only if A \equiv 0 (mod N) or B \equiv 0 (mod N). In other words, if a prime number N divides a product of two numbers, it divides at least one of the two numbers.

Second, if N is prime, then the equation AX \equiv 1 (mod N) has a unique solution (mod N) for all 0 < A < N. This solution, 0 < X < N, is the multiplicative inverse.

Third, if N is prime, then the equation X^2 \equiv A (mod N) has either two solutions (mod N) for all 0 < A < N, or it has no solutions.

There are many theorems that apply to modular arithmetic, and some of them require extraordinary proofs in number theory. We will use modular arithmetic sparingly, and the preceding theorems will suffice.

The P Word

The two most common ways of proving statements in data-structure analysis are proof by induction and proof by contradiction (and occasionally proof by intimidation, used by professors only). The best way of proving that a theorem is false is by exhibiting a counterexample.

Proof by Induction

A proof by induction has two standard parts. The first step is proving a base case, that is, establishing that a theorem is true for some small (usually degenerate) value(s); this step is almost always trivial. Next, an inductive hypothesis is assumed. Generally this means that the theorem is assumed to be true for all cases up to some limit k. Using this assumption, the theorem is then shown to be true for the next value, which is typically k + 1. This proves the theorem (as long as k is finite).

As an example, we prove that the Fibonacci numbers, F_0 = 1, F_1 = 1, F_2 = 2, F_3 = 3, F_4 = 5, \ldots, F_i = F_{i-1} + F_{i-2}, satisfy F_i < (5/3)^i for i \ge 1. (Some definitions have F_0 = 0, which shifts the series.) To do this, we first verify that the theorem is true for the trivial cases. It is easy to verify that F_1 = 1 < 5/3 and F_2 = 2 < 25/9; this proves the basis. We assume that the theorem is true for i = 1, 2, \ldots, k; this is the inductive hypothesis. To prove the theorem, we need to show that F_{k+1} < (5/3)^{k+1}. We have

F_{k+1} = F_k + F_{k-1}

by the definition, and we can use the inductive hypothesis on the right-hand side, obtaining

F_{k+1} < (5/3)^k + (5/3)^{k-1}
        = (3/5)(5/3)^{k+1} + (3/5)^2 (5/3)^{k+1}
        = (3/5)(5/3)^{k+1} + (9/25)(5/3)^{k+1}

which simplifies to
F_{k+1} < (3/5 + 9/25)(5/3)^{k+1}
        = (24/25)(5/3)^{k+1}
        < (5/3)^{k+1}

proving the theorem.

As a second example, we establish the following theorem.

Theorem 1.3
If N \ge 1, then

\sum_{i=1}^{N} i^2 = \frac{N(N+1)(2N+1)}{6}

Proof
The proof is by induction. For the basis, it is readily seen that the theorem is true when N = 1. For the inductive hypothesis, assume that the theorem is true for 1 \le k \le N. We will establish that, under this assumption, the theorem is true for N + 1. We have

\sum_{i=1}^{N+1} i^2 = \sum_{i=1}^{N} i^2 + (N+1)^2

Applying the inductive hypothesis, we obtain

\sum_{i=1}^{N+1} i^2 = \frac{N(N+1)(2N+1)}{6} + (N+1)^2
                     = (N+1)\left[ \frac{N(2N+1)}{6} + (N+1) \right]
                     = \frac{(N+1)(2N^2 + 7N + 6)}{6}
                     = \frac{(N+1)(N+2)(2N+3)}{6}

Thus,

\sum_{i=1}^{N+1} i^2 = \frac{(N+1)[(N+1)+1][2(N+1)+1]}{6}

proving the theorem.

Proof by Counterexample

The statement F_k \le k^2 is false. The easiest way to prove this is to compute F_{11} = 144 > 11^2 = 121.

Proof by Contradiction

Proof by contradiction proceeds by assuming that the theorem is false and showing that this assumption implies that some known property is false, and hence the original assumption was erroneous. A classic example is the proof that there is an infinite number of primes. To prove this, we assume that the theorem is false, so that there is some largest prime P_k. Let P_1, P_2, \ldots, P_k be all the primes in order and consider