Genetic Algorithm Examples
Improving Your Results
To get the best results from the genetic algorithm, you usually need to experiment with different options. Selecting the best options for a problem involves trial and error. This section describes some ways you can change options to improve results. For a complete description of the available options, see Genetic Algorithm Options.
Population Diversity
One of the most important factors that determines the performance of the genetic algorithm is the diversity of the population. If the average distance between individuals is large, the diversity is high; if the average distance is small, the diversity is low. Getting the right amount of diversity is a matter of trial and error. If the diversity is too high or too low, the genetic algorithm might not perform well.
This section explains how to control diversity by setting the Initial range of the population. Setting the Amount of Mutation describes how the amount of mutation affects diversity.
This section also explains how to set the population size.
Example: Setting the Initial Range
By default, ga creates a random initial population using a creation function. You can specify the range of the vectors in the initial population in the Initial range field in Population options.
Note The initial range restricts the range of the points in the initial population by specifying the lower and upper bounds. Subsequent generations can contain points whose entries do not lie in the initial range. Set upper and lower bounds for all generations in the Bounds fields in the Constraints panel.
If you know approximately where the solution to a problem lies, specify the initial range so that it contains your guess for the solution. However, the genetic algorithm can find the solution even if it does not lie in the initial range, if the population has enough diversity.
The following example shows how the initial range affects the performance of the genetic algorithm. The example uses Rastrigin's function, described in Example: Rastrigin's Function. The minimum value of the function is 0, which occurs at the origin.
To run the example, open the ga solver in the Optimization Tool by entering optimtool('ga') at the command line. Set the following:
Click Start in Run solver and view results. Although the results of genetic algorithm computations are random, your results are similar to the following figure, with a best fitness function value of approximately 2.
The upper plot, which displays the best fitness at each generation, shows little progress in lowering the fitness value. The lower plot shows the average distance between individuals at each generation, which is a good measure of the diversity of a population. For this setting of initial range, there is too little diversity for the algorithm to make progress.
Next, try setting Initial range to [1;100] and running the algorithm. This time the results are more variable. You might obtain a plot with a best fitness value of 35, as in the following plot, or you might obtain different results.
This time, the genetic algorithm makes progress, but because the average distance between individuals is so large, the best individuals are far from the optimal solution.
Finally, set Initial range to [1;2] and run the genetic algorithm. Again, there is variability in the result, but you might obtain a result similar to the following figure. Run the optimization several times, and you eventually obtain a final point near [0;0], with a fitness function value near 0.
The diversity in this case is better suited to the problem, so ga usually returns a better result than in the previous two cases.
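If you prefer to work at the command line instead of the Optimization Tool, you can sketch the same experiment with gaoptimset. The choice of two variables and of the best-fitness and distance plot functions is an assumption based on the description above:

```matlab
% Command-line sketch of the initial-range experiment (assumes two
% variables and the Best fitness / Distance plots described above).
options = gaoptimset('PopInitRange',[1;2], ...
    'PlotFcns',{@gaplotbestf,@gaplotdistance});
[x,fval] = ga(@rastriginsfcn,2,[],[],[],[],[],[],[],options)
```

Change the [1;2] range to [1;1.1] or [1;100] to reproduce the other two cases.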
Example: Linearly Constrained Population and Custom Plot Function
This example shows how the default creation function for linearly constrained problems, gacreationlinearfeasible, creates a well-dispersed population that satisfies linear constraints and bounds. It also contains an example of a custom plot function.
The problem uses the objective function in lincontest6.m, a quadratic:
To see code for the function, enter type lincontest6 at the command line. The constraints are three linear inequalities:
x1 + x2 ≤ 2,
–x1 + 2x2 ≤ 2,
2x1 + x2 ≤ 3.
Also, the variables xi are restricted to be positive.
Create a custom plot function file by cutting and pasting the following code into a new function file in the MATLAB Editor:
function state = gaplotshowpopulation2(unused,state,flag,fcn)
% This plot function works in 2-d only
if size(state.Population,2) > 2
    return;
end
if nargin < 4 % check to see if fitness function exists
    fcn = [];
end
% Dimensions to plot
dimensionsToPlot = [1 2];

switch flag
    % Plot initialization
    case 'init'
        pop = state.Population(:,dimensionsToPlot);
        plotHandle = plot(pop(:,1),pop(:,2),'*');
        set(plotHandle,'Tag','gaplotshowpopulation2')
        title('Population plot in two dimension',...
            'interp','none')
        xlabelStr = sprintf('%s %s','Variable ',...
            num2str(dimensionsToPlot(1)));
        ylabelStr = sprintf('%s %s','Variable ',...
            num2str(dimensionsToPlot(2)));
        xlabel(xlabelStr,'interp','none');
        ylabel(ylabelStr,'interp','none');
        hold on;

        % plot the inequalities
        plot([0 1.5],[2 0.5],'m-.')     % x1 + x2 <= 2
        plot([0 1.5],[1 3.5/2],'m-.');  % -x1 + 2*x2 <= 2
        plot([0 1.5],[3 0],'m-.');      % 2*x1 + x2 <= 3

        % plot lower bounds
        plot([0 0], [0 2],'m-.');       % lb = [0 0];
        plot([0 1.5], [0 0],'m-.');     % lb = [0 0];
        set(gca,'xlim',[-0.7,2.2])
        set(gca,'ylim',[-0.7,2.7])

        % Contour plot the objective function
        if ~isempty(fcn) % if there is a fitness function
            range = [-0.5,2;-0.5,2];
            pts = 100;
            span = diff(range')/(pts - 1);
            x = range(1,1): span(1) : range(1,2);
            y = range(2,1): span(2) : range(2,2);
            pop = zeros(pts * pts,2);
            values = zeros(pts,1);
            k = 1;
            for i = 1:pts
                for j = 1:pts
                    pop(k,:) = [x(i),y(j)];
                    values(k) = fcn(pop(k,:));
                    k = k + 1;
                end
            end
            values = reshape(values,pts,pts);
            contour(x,y,values);
            colorbar
        end
        % Pause for three seconds to view the initial plot
        pause(3);
    case 'iter'
        pop = state.Population(:,dimensionsToPlot);
        plotHandle = findobj(get(gca,'Children'),'Tag',...
            'gaplotshowpopulation2');
        set(plotHandle,'Xdata',pop(:,1),'Ydata',pop(:,2));
end
The custom plot function plots the lines representing the linear inequalities and bound constraints, plots level curves of the fitness function, and plots the population as it evolves. This plot function expects to have not only the usual inputs (options,state,flag), but also a function handle to the fitness function, @lincontest6 in this example. To generate level curves, the custom plot function needs the fitness function.
At the command line, enter the constraints as a matrix and vectors:
A = [1,1;-1,2;2,1];
b = [2;2;3];
lb = zeros(2,1);
Set options to use gaplotshowpopulation2, and pass in @lincontest6 as the fitness function handle:
options = gaoptimset('PlotFcns',...
    {{@gaplotshowpopulation2,@lincontest6}});
Run the optimization using options:
[x,fval] = ga(@lincontest6,2,A,b,[],[],lb,[],[],options);
A plot window appears showing the linear constraints, bounds, level curves of the objective function, and initial distribution of the population:
You can see that the initial population is biased to lie on the constraints.
The population eventually concentrates around the minimum point:
Setting the Population Size
The Population size field in Population options determines the size of the population at each generation. Increasing the population size enables the genetic algorithm to search more points and thereby obtain a better result. However, the larger the population size, the longer the genetic algorithm takes to compute each generation.
Note You should set Population size to be at least the value of Number of variables, so that the individuals in each population span the space being searched.
You can experiment with different settings for Population size that return good results without taking a prohibitive amount of time to run.
Fitness Scaling
Fitness scaling converts the raw fitness scores that are returned by the fitness function to values in a range that is suitable for the selection function. The selection function uses the scaled fitness values to select the parents of the next generation. The selection function assigns a higher probability of selection to individuals with higher scaled values.
The range of the scaled values affects the performance of the genetic algorithm. If the scaled values vary too widely, the individuals with the highest scaled values reproduce too rapidly, taking over the population gene pool too quickly, and preventing the genetic algorithm from searching other areas of the solution space. On the other hand, if the scaled values vary only a little, all individuals have approximately the same chance of reproduction and the search will progress very slowly.
The default fitness scaling option, Rank, scales the raw scores based on the rank of each individual instead of its score. The rank of an individual is its position in the sorted scores: the rank of the most fit individual is 1, the next most fit is 2, and so on. The rank scaling function assigns scaled values so that the scaled value of an individual with rank n is proportional to 1/sqrt(n).
Rank fitness scaling removes the effect of the spread of the raw scores.
The following plot shows the raw scores of a typical population of 20 individuals, sorted in increasing order.
The following plot shows the scaled values of the raw scores using rank scaling.
Because the algorithm minimizes the fitness function, lower raw scores have higher scaled values. Also, because rank scaling assigns values that depend only on an individual's rank, the scaled values shown would be the same for any population of size 20 and number of parents equal to 32.
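As a rough sketch of the idea (not the toolbox source), rank scaling can be computed as follows, assuming the scaled value is proportional to 1/sqrt(rank) and normalized so the scaled values sum to the number of parents:

```matlab
% Illustrative sketch of rank scaling. Assumes scaled values are
% proportional to 1/sqrt(rank), normalized to sum to the number of
% parents. This is not the toolbox implementation.
rawScores = rand(20,1);          % raw scores of a population of 20
nParents  = 32;
[~,order] = sort(rawScores);     % lowest (best) score gets rank 1
ranks(order) = 1:numel(rawScores);
scaled = 1./sqrt(ranks(:));
scaled = nParents * scaled/sum(scaled);
```

Note that the scaled values depend only on the ranks, not on the spread of the raw scores, which is the point made above.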
Comparing Rank and Top Scaling
To see the effect of scaling, you can compare the results of the genetic algorithm using rank scaling with one of the other scaling options, such as Top. By default, top scaling assigns the 40 percent fittest individuals the same scaled value and assigns the rest a scaled value of 0. Using the default selection function, only 40 percent of the fittest individuals can be selected as parents.
The following figure compares the scaled values of a population of size 20 with number of parents equal to 32 using rank and top scaling.
Because top scaling restricts parents to the fittest individuals, it creates less diverse populations than rank scaling. The following plot compares the variances of distances between individuals at each generation using rank and top scaling.
Selection
The selection function chooses parents for the next generation based on their scaled values from the fitness scaling function. An individual can be selected more than once as a parent, in which case it contributes its genes to more than one child. The default selection option, Stochastic uniform, lays out a line in which each parent corresponds to a section of the line of length proportional to its scaled value. The algorithm moves along the line in steps of equal size. At each step, the algorithm allocates a parent from the section it lands on.
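The line-stepping idea behind Stochastic uniform can be sketched as follows; the scaled values are made up for illustration, and this is not the toolbox code:

```matlab
% Sketch of stochastic uniform selection: lay the scaled values end to
% end on a line, then step along it in equal-size steps from a single
% random start, picking the parent whose section each step lands on.
scaled   = [2.5 1.8 1.2 0.5];   % scaled fitness values (assumed)
nParents = 4;
edges    = cumsum(scaled);      % section boundaries along the line
step     = edges(end)/nParents;
marks    = rand*step + (0:nParents-1)*step;   % equally spaced pointers
parents  = arrayfun(@(m) find(m <= edges, 1), marks)
```

Because the pointers are equally spaced, an individual whose section is longer than one step is guaranteed at least one selection, which makes this method less noisy than independent roulette draws.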
A more deterministic selection option is Remainder, which performs two steps: first it allocates parents deterministically according to the integer part of each individual's scaled value, and then it uses roulette selection on the remaining fractional parts.
Reproduction Options
Reproduction options control how the genetic algorithm creates the next generation. The options are Elite count, the number of individuals with the best fitness values that are guaranteed to survive to the next generation, and Crossover fraction, the fraction of the next generation, other than elite children, that is produced by crossover.
Mutation and Crossover
The genetic algorithm uses the individuals in the current generation to create the children that make up the next generation. Besides elite children, which correspond to the individuals in the current generation with the best fitness values, the algorithm creates crossover children, by combining pairs of parents from the current generation, and mutation children, by applying random changes to a single parent.
Both processes are essential to the genetic algorithm. Crossover enables the algorithm to extract the best genes from different individuals and recombine them into potentially superior children. Mutation adds to the diversity of a population and thereby increases the likelihood that the algorithm will generate individuals with better fitness values.
See Creating the Next Generation for an example of how the genetic algorithm applies mutation and crossover.
You can specify how many children of each type the algorithm creates through the Elite count and Crossover fraction options:
For example, if the Population size is 20, the Elite count is 2, and the Crossover fraction is 0.8, the numbers of each type of children in the next generation are as follows: there are 2 elite children, leaving 18 individuals other than elite children; the algorithm rounds 0.8 * 18 = 14.4 to get 14 crossover children; and the remaining 4 individuals are mutation children.
Setting the Amount of Mutation
The genetic algorithm applies mutations using the option that you specify on the Mutation function pane. The default mutation option, Gaussian, adds a random number, or mutation, chosen from a Gaussian distribution, to each entry of the parent vector. Typically, the amount of mutation, which is proportional to the standard deviation of the distribution, decreases at each new generation. You can control the average amount of mutation that the algorithm applies to a parent in each generation through the Scale and Shrink options: Scale controls the standard deviation of the mutation at the first generation, and Shrink controls how quickly that standard deviation decreases as generations go by.
You can see the effect of mutation by selecting the plot options Distance and Range, and then running the genetic algorithm on a problem such as the one described in Example: Rastrigin's Function. The following figure shows the plot.
The upper plot displays the average distance between points in each generation. As the amount of mutation decreases, so does the average distance between individuals, which is approximately 0 at the final generation. The lower plot displays a vertical line at each generation, showing the range from the smallest to the largest fitness value, as well as the mean fitness value. As the amount of mutation decreases, so does the range. These plots show that reducing the amount of mutation decreases the diversity of subsequent generations.
For comparison, the following figure shows the plots for Distance and Range when you set Shrink to 0.5.
With Shrink set to 0.5, the average amount of mutation decreases by a factor of 1/2 by the final generation. As a result, the average distance between individuals decreases by approximately the same factor.
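The effect of Shrink can be sketched with a simplified model of the decay, assuming the mutation standard deviation falls linearly from Scale down to (1 - Shrink) times Scale at the final generation, which matches the factor-of-1/2 behavior described above:

```matlab
% Simplified sketch of how Shrink reduces the mutation standard
% deviation over a run. Assumes a linear decay from Scale down to
% (1 - Shrink)*Scale at the final generation (an approximation).
Scale = 1; Generations = 100;
k = 0:Generations;
sigmaFull = Scale*(1 - 1.0*k/Generations);   % Shrink = 1 (default)
sigmaHalf = Scale*(1 - 0.5*k/Generations);   % Shrink = 0.5
plot(k,sigmaFull,k,sigmaHalf)
legend('Shrink = 1','Shrink = 0.5')
xlabel('Generation'), ylabel('Mutation standard deviation')
```

With Shrink = 1 the mutation decays to 0 by the final generation, which is why the average distance plot above approaches 0.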
Setting the Crossover Fraction
The Crossover fraction field, in the Reproduction options, specifies the fraction of each population, other than elite children, that is made up of crossover children. A crossover fraction of 1 means that all children other than elite individuals are crossover children, while a crossover fraction of 0 means that all children are mutation children. The following example shows that neither of these extremes is an effective strategy for optimizing a function.
The example uses the fitness function whose value at a point is the sum of the absolute values of the coordinates of the point. That is, f(x) = |x1| + |x2| + ... + |xn|.
You can define this function as an anonymous function by setting Fitness function to
@(x) sum(abs(x))
To run the example,
Run the example with the default value of 0.8 for Crossover fraction, in the Options > Reproduction pane. This returns the best fitness value of approximately 0.2 and displays the following plots.
Crossover Without Mutation
To see how the genetic algorithm performs when there is no mutation, set Crossover fraction to 1.0 and click Start. This returns the best fitness value of approximately 1.3 and displays the following plots.
In this case, the algorithm selects genes from the individuals in the initial population and recombines them. The algorithm cannot create any new genes because there is no mutation. The algorithm generates the best individual that it can using these genes at generation number 8, where the best fitness plot becomes level. After this, it creates new copies of the best individual, which are then selected for the next generation. By generation number 17, all individuals in the population are the same, namely, the best individual. When this occurs, the average distance between individuals is 0. Since the algorithm cannot improve the best fitness value after generation 8, it stalls after 50 more generations, because Stall generations is set to 50.
Mutation Without Crossover
To see how the genetic algorithm performs when there is no crossover, set Crossover fraction to 0 and click Start. This returns the best fitness value of approximately 3.5 and displays the following plots.
In this case, the random changes that the algorithm applies never improve the fitness value of the best individual at the first generation. While it improves the individual genes of other individuals, as you can see in the upper plot by the decrease in the mean value of the fitness function, these improved genes are never combined with the genes of the best individual because there is no crossover. As a result, the best fitness plot is level and the algorithm stalls at generation number 50.
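You can run the same comparison at the command line. The number of variables (10 here) is an assumption for illustration, since the example's full settings are not listed above:

```matlab
% Command-line sketch of the crossover-fraction experiment.
% The number of variables (10) is assumed for illustration.
fitness = @(x) sum(abs(x));
for cf = [0.8 1.0 0]            % default, no mutation, no crossover
    options = gaoptimset('CrossoverFraction',cf);
    [x,fval] = ga(fitness,10,[],[],[],[],[],[],[],options);
    fprintf('Crossover fraction %.1f: best fitness %g\n',cf,fval)
end
```

Because ga is stochastic, the printed values vary from run to run, but the intermediate crossover fraction typically gives the lowest best fitness.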
Comparing Results for Varying Crossover Fractions
The demo deterministicstudy.m, which is included in the software, compares the results of applying the genetic algorithm to Rastrigin's function with Crossover fraction set to 0, .2, .4, .6, .8, and 1. The demo runs for 10 generations. At each generation, the demo plots the means and standard deviations of the best fitness values in all the preceding generations, for each value of the Crossover fraction.
To run the demo, enter
deterministicstudy
at the MATLAB prompt. When the demo is finished, the plots appear as in the following figure.
The lower plot shows the means and standard deviations of the best fitness values over 10 generations, for each of the values of the crossover fraction. The upper plot shows a color-coded display of the best fitness values in each generation.
For this fitness function, setting Crossover fraction to 0.8 yields the best result. However, for another fitness function, a different setting for Crossover fraction might yield the best result.
Example: Global vs. Local Minima with GA
Sometimes the goal of an optimization is to find the global minimum or maximum of a function—a point where the function value is smaller or larger than at any other point in the search space. However, optimization algorithms sometimes return a local minimum—a point where the function value is smaller than at nearby points, but possibly greater than at a distant point in the search space. The genetic algorithm can sometimes overcome this deficiency with the right settings.
As an example, consider the following piecewise function: f(x) = -exp(-(x/20)^2) for x <= 20, and f(x) = -exp(-1) + (x - 20)(x - 22) for x > 20.
The following figure shows a plot of the function.
The function has two local minima, one at x = 0, where the function value is -1, and the other at x = 21, where the function value is -1 - 1/e. Since the latter value is smaller, the global minimum occurs at x = 21.
Running the Genetic Algorithm on the Example
To run the genetic algorithm on this example,
Copy and paste the following code into a new file in the MATLAB Editor.
function y = two_min(x)
if x <= 20
    y = -exp(-(x/20).^2);
else
    y = -exp(-1) + (x-20)*(x-22);
end
Save the file as two_min.m in a folder on the MATLAB path.
In the Optimization Tool,
The genetic algorithm returns a point very close to the local minimum at x = 0.
The following custom plot shows why the algorithm finds the local minimum rather than the global minimum. The plot shows the range of individuals in each generation and the best individual.
Note that all individuals are between -2 and 2.5. While this range is larger than the default Initial range of [0;1], due to mutation, it is not large enough to explore points near the global minimum at x = 21.
One way to make the genetic algorithm explore a wider range of points—that is, to increase the diversity of the populations—is to increase the Initial range. The Initial range does not have to include the point x = 21, but it must be large enough so that the algorithm generates individuals near x = 21. Set Initial range to [0;15] as shown in the following figure.
Then click Start. The genetic algorithm returns a pointvery close to 21.
This time, the custom plot shows a much wider range of individuals. By the second generation there are individuals greater than 21, and by generation 12, the algorithm finds a best individual that is approximately equal to 21.
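At the command line, widening the initial range corresponds to setting the PopInitRange option (a sketch using the two_min file created above):

```matlab
% Command-line sketch: widen the initial range so the population can
% generate individuals near the global minimum at x = 21.
options = gaoptimset('PopInitRange',[0;15]);
[x,fval] = ga(@two_min,1,[],[],[],[],[],[],[],options)
```

With the default [0;1] range instead, ga typically returns the local minimum near x = 0.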
Using a Hybrid Function
A hybrid function is an optimization function that runs after the genetic algorithm terminates in order to improve the value of the fitness function. The hybrid function uses the final point from the genetic algorithm as its initial point. You can specify a hybrid function in Hybrid function options.
This example uses the Optimization Toolbox function fminunc, an unconstrained minimization function. The example first runs the genetic algorithm to find a point close to the optimal point and then uses that point as the initial point for fminunc.
The example finds the minimum of Rosenbrock's function, which is defined by f(x) = 100(x1^2 - x2)^2 + (1 - x1)^2.
The following figure shows a plot of Rosenbrock's function.
Global Optimization Toolbox software contains the dejong2fcn.m file, which computes Rosenbrock's function. To see a demo of this example, enter
hybriddemo
at the MATLAB prompt.
To explore the example, first enter optimtool('ga') to open the Optimization Tool to the ga solver. Enter the following settings:
Before adding a hybrid function, click Start to run the genetic algorithm by itself. The genetic algorithm displays the following results in the Run solver and view results pane:
The final point is somewhat close to the true minimum at (1, 1). You can improve this result by setting Hybrid function to fminunc (in the Hybrid function options).
fminunc uses the final point of the genetic algorithm as its initial point. It returns a more accurate result, as shown in the Run solver and view results pane.
You can set options for the hybrid function separately from the calling function. Use optimset (or psoptimset for the patternsearch hybrid function) to create the options structure. For example:
hybridopts = optimset('display','iter','LargeScale','off');
In the Optimization Tool, enter the name of your options structure in the Options box under Hybrid function:
At the command line, the syntax is as follows:
options = gaoptimset('HybridFcn',{@fminunc,hybridopts});
hybridopts must exist before you set options.
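Putting the pieces together, the command-line workflow can be sketched as follows, using the dejong2fcn file mentioned above:

```matlab
% Command-line sketch of the hybrid-function workflow: create the
% hybrid options first, then pass them to ga through 'HybridFcn'.
hybridopts = optimset('Display','iter','LargeScale','off');
options = gaoptimset('HybridFcn',{@fminunc,hybridopts});
[x,fval] = ga(@dejong2fcn,2,[],[],[],[],[],[],[],options)
```

When ga terminates, fminunc starts from ga's final point and refines it toward (1, 1).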
Setting the Maximum Number of Generations
The Generations option in Stopping criteria determines the maximum number of generations the genetic algorithm runs for—see Stopping Conditions for the Algorithm. Increasing the Generations option often improves the final result.
As an example, change the settings in the Optimization Tool as follows:
Run the genetic algorithm for approximately 300 generations and click Stop. The following figure shows the resulting best fitness plot after 300 generations.
Note that the algorithm stalls at approximately generation number 170—that is, there is no immediate improvement in the fitness function after generation 170. If you restore Stall generations to its default value of 50, the algorithm would terminate at approximately generation number 230. If the genetic algorithm stalls repeatedly with the current setting for Generations, you can try increasing both the Generations and Stall generations options to improve your results. However, changing other options might be more effective.
Note When Mutation function is set to Gaussian, increasing the value of Generations might actually worsen the final result. This can occur because the Gaussian mutation function decreases the average amount of mutation in each generation by a factor that depends on the value specified in Generations. Consequently, the setting for Generations affects the behavior of the algorithm.
Vectorizing the Fitness Function
The genetic algorithm usually runs faster if you vectorize the fitness function. This means that the genetic algorithm only calls the fitness function once, but expects the fitness function to compute the fitness for all individuals in the current population at once. To vectorize the fitness function, write it to accept a matrix with an arbitrary number of rows, where each row represents one individual, and to return a column vector of fitness values; then set the Vectorize option to On.
Note The fitness function, and any nonlinear constraint function, must accept an arbitrary number of rows to use the Vectorize option. ga sometimes evaluates a single row even during a vectorized calculation.
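For example, a vectorized version of Rastrigin's function might look like this, where each row of pop is one individual (the file name is an assumption for illustration):

```matlab
% Sketch of a vectorized fitness function for Rastrigin's function.
% pop is a matrix with one individual per row; the return value is a
% column vector of scores. Save as vectorized_rastrigin.m (assumed name).
function scores = vectorized_rastrigin(pop)
scores = 10*size(pop,2) + sum(pop.^2 - 10*cos(2*pi*pop),2);
```

The element-wise operators (.^ and the 2 in sum's second argument) are what let the function score the whole population in a single call.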
The following comparison, run at the command line, shows the improvement in speed with Vectorize set to On.
tic; ga(@rastriginsfcn,20); toc

elapsed_time =

    4.3660

options = gaoptimset('Vectorize','on');
tic; ga(@rastriginsfcn,20,[],[],[],[],[],[],[],options); toc

elapsed_time =

    0.5810
If there are nonlinear constraints, the objective function and the nonlinear constraints all need to be vectorized in order for the algorithm to compute in a vectorized manner.
Constrained Minimization Using ga
Suppose you want to minimize the simple fitness function of two variables x1 and x2, f(x) = 100(x1^2 - x2)^2 + (1 - x1)^2, subject to the nonlinear inequality constraints 1.5 + x1*x2 + x1 - x2 <= 0 and 10 - x1*x2 <= 0, and the bounds 0 <= x1 <= 1 and 0 <= x2 <= 13.
Begin by creating the fitness and constraint functions. First, create a file named simple_fitness.m as follows:
function y = simple_fitness(x)
y = 100*(x(1)^2 - x(2))^2 + (1 - x(1))^2;
The genetic algorithm function, ga, assumes the fitness function will take one input x, where x has as many elements as the number of variables in the problem. The fitness function computes the value of the function and returns that scalar value in its one return argument, y.
Then create a file, simple_constraint.m, containing the constraints:
function [c, ceq] = simple_constraint(x)
c = [1.5 + x(1)*x(2) + x(1) - x(2);...
     -x(1)*x(2) + 10];
ceq = [];
The ga function assumes the constraint function will take one input x, where x has as many elements as the number of variables in the problem. The constraint function computes the values of all the inequality and equality constraints and returns two vectors, c and ceq, respectively.
To minimize the fitness function, you need to pass a function handle to the fitness function as the first argument to the ga function, as well as specifying the number of variables as the second argument. Lower and upper bounds are provided as LB and UB respectively. In addition, you need to pass a function handle to the nonlinear constraint function.
ObjectiveFunction = @simple_fitness;
nvars = 2;    % Number of variables
LB = [0 0];   % Lower bound
UB = [1 13];  % Upper bound
ConstraintFunction = @simple_constraint;
[x,fval] = ga(ObjectiveFunction,nvars,[],[],[],[],LB,UB,ConstraintFunction)

Optimization terminated: average change in the fitness value
less than options.TolFun and constraint violation is less than options.TolCon.

x =
    0.8122   12.3122

fval =
  1.3578e+004
The genetic algorithm solver handles linear constraints and bounds differently from nonlinear constraints. All the linear constraints and bounds are satisfied throughout the optimization. However, ga may not satisfy all the nonlinear constraints at every generation. If ga converges to a solution, the nonlinear constraints will be satisfied at that solution.
ga uses the mutation and crossover functions to produce new individuals at every generation. ga satisfies linear and bound constraints by using mutation and crossover functions that only generate feasible points. For example, in the previous call to ga, the default mutation function mutationgaussian does not necessarily obey the bound constraints. So when there are bound or linear constraints, the default ga mutation function is mutationadaptfeasible. If you provide a custom mutation function, this custom function must only generate points that are feasible with respect to the linear and bound constraints. All the included crossover functions generate points that satisfy the linear constraints and bounds except the crossoverheuristic function.
To see the progress of the optimization, use the gaoptimset function to create an options structure that selects two plot functions. The first plot function is gaplotbestf, which plots the best and mean score of the population at every generation. The second plot function is gaplotmaxconstr, which plots the maximum constraint violation of nonlinear constraints at every generation. You can also visualize the progress of the algorithm by displaying information to the command window using the 'Display' option.
options = gaoptimset('PlotFcns',{@gaplotbestf,@gaplotmaxconstr},...
    'Display','iter');
Rerun the ga solver.
[x,fval] = ga(ObjectiveFunction,nvars,[],[],[],[],...
    LB,UB,ConstraintFunction,options)

                          Best       max        Stall
Generation  f-count       f(x)    constraint  Generations
    1          849     14915.8        0           0
    2         1567     13578.3        0           0
    3         2334     13578.3        0           1
    4         3043     13578.3        0           2
    5         3752     13578.3        0           3
Optimization terminated: average change in the fitness value
less than options.TolFun and constraint violation is less than options.TolCon.

x =
    0.8122   12.3123

fval =
  1.3578e+004
You can provide a start point for the minimization to the ga function by specifying the InitialPopulation option. The ga function will use the first individual from InitialPopulation as a start point for a constrained minimization.
X0 = [0.5 0.5];   % Start point (row vector)
options = gaoptimset(options,'InitialPopulation',X0);
Now, rerun the ga solver.
[x,fval] = ga(ObjectiveFunction,nvars,[],[],[],[],...
    LB,UB,ConstraintFunction,options)

                          Best       max        Stall
Generation  f-count       f(x)    constraint  Generations
    1          965     13579.6        0           0
    2         1728     13578.2    1.776e-015      0
    3         2422     13578.2        0           0
Optimization terminated: average change in the fitness value
less than options.TolFun and constraint violation is less than options.TolCon.

x =
    0.8122   12.3122

fval =
  1.3578e+004
Vectorized Constraints
If there are nonlinear constraints, the objective function and the nonlinear constraints all need to be vectorized in order for the algorithm to compute in a vectorized manner.
Vectorizing the Objective and Constraint Functions contains an example of how to vectorize both for the solver patternsearch. The syntax is nearly identical for ga. The only difference is that patternsearch can have its patterns appear as either row or column vectors; the corresponding vectors for ga are the population vectors, which are always rows.