<?xml version="1.0" encoding="UTF-8"?><!DOCTYPE article  PUBLIC "-//NLM//DTD Journal Publishing DTD v3.0 20080202//EN" "http://dtd.nlm.nih.gov/publishing/3.0/journalpublishing3.dtd"><article xmlns:mml="http://www.w3.org/1998/Math/MathML" xmlns:xlink="http://www.w3.org/1999/xlink" dtd-version="3.0" xml:lang="en" article-type="research article"><front><journal-meta><journal-id journal-id-type="publisher-id">OJS</journal-id><journal-title-group><journal-title>Open Journal of Statistics</journal-title></journal-title-group><issn pub-type="epub">2161-718X</issn><publisher><publisher-name>Scientific Research Publishing</publisher-name></publisher></journal-meta><article-meta><article-id pub-id-type="doi">10.4236/ojs.2016.61016</article-id><article-id pub-id-type="publisher-id">OJS-63884</article-id><article-categories><subj-group subj-group-type="heading"><subject>Articles</subject></subj-group><subj-group subj-group-type="Discipline-v2"><subject>Physics&amp;Mathematics</subject></subj-group></article-categories><title-group><article-title>
The Dual of the Maximum Likelihood
</article-title></title-group><contrib-group><contrib contrib-type="author" xlink:type="simple"><name name-style="western"><surname>Paris</surname><given-names>Quirino</given-names></name><xref ref-type="aff" rid="aff1"><sub>1</sub></xref><xref ref-type="corresp" rid="cor1"><sup>*</sup></xref></contrib></contrib-group><aff id="aff1"><label>1</label><addr-line>Department of Agricultural and Resource Economics, University of California, Davis, CA, USA</addr-line></aff><author-notes><corresp id="cor1">* E-mail: <email>paris@primal.ucdavis.edu</email></corresp></author-notes><pub-date pub-type="epub"><day>03</day><month>02</month><year>2016</year></pub-date><volume>06</volume><issue>01</issue><fpage>186</fpage><lpage>193</lpage><history><date date-type="received"><day>31</day><month>January</month><year>2016</year></date><date date-type="rev-recd"><day>23</day><month>February</month><year>2016</year></date><date date-type="accepted"><day>26</day><month>February</month><year>2016</year></date></history><permissions><copyright-statement>&#169; Copyright 2016 by authors and Scientific Research Publishing Inc. </copyright-statement><copyright-year>2016</copyright-year><license><license-p>This work is licensed under the Creative Commons Attribution International License (CC BY). http://creativecommons.org/licenses/by/4.0/</license-p></license></permissions><abstract><p>
The Maximum Likelihood method estimates the parameter values of a statistical model that maximize the corresponding likelihood function, given the sample information. This is the primal approach that, in this paper, is presented as a mathematical programming specification whose solution requires the formulation of a Lagrange problem. A result of this setup is that the Lagrange multipliers associated with the linear statistical model (where sample observations are regarded as a set of constraints) are equal to the vector of residuals scaled by the variance of those residuals. The novel contribution of this paper consists in deriving the dual model of the Maximum Likelihood method under normality assumptions. This model minimizes a function of the variance of the error terms subject to orthogonality conditions between the model residuals and the space of explanatory variables. An intuitive interpretation of the dual problem appeals to basic elements of information theory and to the economic interpretation of Lagrange multipliers to establish that the dual model maximizes the net value of the sample information. This paper presents the dual ML model for a single regression and provides a numerical example of how to obtain maximum likelihood estimates of the parameters of a linear statistical model using the dual specification.
</p></abstract><kwd-group><kwd>Maximum Likelihood</kwd><kwd>Primal</kwd><kwd>Dual</kwd><kwd>Signal</kwd><kwd>Noise</kwd><kwd>Value of Sample Information</kwd></kwd-group></article-meta></front><body><sec id="s1"><title>1. Introduction</title><p>In general, to any problem stated in the form of a maximization (minimization) criterion, there corresponds a dual specification in the form of a minimization (maximization) goal. This structure applies also to the Maximum Likelihood (ML) approach, one of the most widely used statistical methodologies. A clear description of the Maximum Likelihood principle is found in Kmenta ([<xref ref-type="bibr" rid="scirp.63884-ref1">1</xref>], pp. 175-180): Given a sample of observations about random variables, “the question is: To which population does the sample most likely belong?” The answer can be found by defining ([<xref ref-type="bibr" rid="scirp.63884-ref2">2</xref>], p. 396) “the likelihood function as the joint probability distribution of the data, treated as a function of the unknown coefficients. The Maximum Likelihood (ML) estimator of the unknown coefficients consists of the values of the coefficients that maximize the likelihood function”. That is, the maximum likelihood estimator selects the parameter values that give the observed data sample the largest possible probability of having been drawn from a population defined by the estimated coefficients.</p><p>In this paper, we concentrate on data samples drawn from normal populations and deal with linear statistical models. The ML approach, then, estimates the mean and variance of the normal population that maximize the likelihood function, given the sample information.</p><p>The novel contribution of the paper consists in developing the dual specification of the Maximum Likelihood method (under normality) and in giving it an intuitive economic interpretation that corresponds to the maximization of the net value of sample information. 
The dual specification is of interest because it exists and complements our knowledge of the Maximum Likelihood methodology. In fact, to any maximization problem subject to linear constraints there corresponds a minimization problem subject to some other type of constraints. The two specifications are equivalent in the sense that they provide identical values of the parameter estimates, but the two paths to achieving those estimates are significantly different. In mathematics, there are many examples of how the same objective (solutions and parameter estimates) may be achieved by different methods. This abundance of approaches has often inspired further discoveries.</p><p>The notion of the dual specification of a statistical problem is not likely familiar to a wide audience of statisticians, in spite of the fact that a large body of statistical methodology relies explicitly on the maximization of a likelihood function or the minimization of a least-squares function. These optimizations are simply primal versions of statistical problems. Yet, the dual specification of the same problems exists, much as the far side of the moon existed and was unknown until a spacecraft circumnavigated that celestial body. Hence, the dual specification of the ML methodology enriches the statistician’s toolkit and understanding of the methodology.</p><p>The analytical framework that ties together the primal and dual specifications of an ML problem is the Lagrange function. This important function has the shape of a saddle where the equilibrium point is achieved by maximizing it in one direction (operating with primal variables: parameters and errors) and minimizing it in the other, orthogonal direction (operating with dual variables: Lagrange multipliers). 
Lagrange multipliers and their meaning, therefore, are crucial elements for understanding the structure of a dual specification.</p><p>In Section 2, we present the traditional Maximum Likelihood method in the form of a primal nonlinear programming model. This specification is a natural step toward the discovery of the dual structure of the ML method. It differs from the traditional way of presenting the ML approach because the model equations (representing the sample information) are not substituted into the likelihood function but are stated as constraints. Hence, the estimates of the parameters and errors of the model are computed simultaneously rather than sequentially. A remarkable result of this setup is that the Lagrange multipliers associated with the linear statistical model (regarded as a set of constraints defined by the sample observations) are revealed to be equal to the residuals scaled by the variance of the error terms. Section 3 derives and interprets the dual specification of the ML method. The intuitive interpretation of the dual problem appeals to basic elements of information theory and economics. The economic interpretation of Lagrange multipliers as shadow prices (prices that cannot be seen on the market) establishes that the dual ML minimizes the negative net value of sample information (NVSI), which is equivalent to maximizing the NVSI. This economic notion will be clarified further in subsequent sections. Although the numerical solution of the dual ML method is feasible (as illustrated with a nontrivial numerical example in Section 4), the main goal of this paper is to present and interpret the dual specification of the ML method, thereby completing the understanding of the double role played by residuals. 
It turns out that residual errors in the primal model assume the intuitive role of noise while, in the dual model, they assume the intuitive role of penalty parameters (for violating the constraints) that in economics correspond to the notion of “marginal sacrifices” or “shadow prices”. The name “marginal sacrifice” indicates that the size of a Lagrange multiplier represents a measure of a constraint’s tightness (in this case, of the observations in the linear model) and directly influences the value of the likelihood function. In other words, the optimal value of the likelihood function varies precisely according to the value of the Lagrange multiplier per unit increase (or decrease) in the level of the associated observation. In this sense, in economics it is customary to refer to the level of a Lagrange multiplier as the “marginal sacrifice” (or “shadow price”) due to the active presence of the corresponding constraint that limits the increase or decrease of the objective function.</p><p>Section 4 presents a numerical example where the parameters of a linear statistical model are estimated using the dual ML specification. The sample comprises 114 observations and there are six parameters to be estimated. A peculiar feature of this dual estimation is given by the analytical expression of the model variance, which must use a rarely recognized formula.</p></sec><sec id="s2"><title>2.
The Primal of the Maximum Likelihood Method (Normal Distribution)</title><p>We consider a linear statistical model</p><disp-formula id="scirp.63884-formula807"><label>(1)</label><graphic position="anchor" xlink:href="http://html.scirp.org/file/16-1240654x6.png" xlink:type="simple"/></disp-formula><p>where <inline-formula><inline-graphic xlink:href="http://html.scirp.org/file/16-1240654x7.png" xlink:type="simple"/></inline-formula> is an <inline-formula><inline-graphic xlink:href="http://html.scirp.org/file/16-1240654x8.png" xlink:type="simple"/></inline-formula> vector of sample data (observations), <inline-formula><inline-graphic xlink:href="http://html.scirp.org/file/16-1240654x9.png" xlink:type="simple"/></inline-formula> is an <inline-formula><inline-graphic xlink:href="http://html.scirp.org/file/16-1240654x10.png" xlink:type="simple"/></inline-formula> matrix of predetermined values of explanatory variables, <inline-formula><inline-graphic xlink:href="http://html.scirp.org/file/16-1240654x11.png" xlink:type="simple"/></inline-formula> is a <inline-formula><inline-graphic xlink:href="http://html.scirp.org/file/16-1240654x12.png" xlink:type="simple"/></inline-formula> vector of unknown parameters, and <inline-formula><inline-graphic xlink:href="http://html.scirp.org/file/16-1240654x13.png" xlink:type="simple"/></inline-formula> is an <inline-formula><inline-graphic xlink:href="http://html.scirp.org/file/16-1240654x14.png" xlink:type="simple"/></inline-formula> vector of random errors that are assumed to be independently and normally distributed as <inline-formula><inline-graphic xlink:href="http://html.scirp.org/file/16-1240654x15.png" xlink:type="simple"/></inline-formula>. The vector of observations <inline-formula><inline-graphic xlink:href="http://html.scirp.org/file/16-1240654x16.png" xlink:type="simple"/></inline-formula> is also known as the sample information. 
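</p><p>As a concrete illustration of model (1) and of the likelihood function that the primal problem maximizes, the following sketch simulates a small instance of the model and codes the normal log-likelihood. All names, dimensions, and parameter values are illustrative and are not taken from the paper.</p>

```python
import numpy as np

# Simulated instance of model (1): y = X beta + e, with e ~ N(0, sigma^2 I).
rng = np.random.default_rng(0)
n, k = 200, 3
X = np.column_stack([np.ones(n), rng.normal(size=(n, k - 1))])
beta_true = np.array([1.0, 2.0, -0.5])
sigma2_true = 0.25
y = X @ beta_true + rng.normal(scale=np.sqrt(sigma2_true), size=n)

def log_likelihood(beta, sigma2, y, X):
    """Normal log-likelihood of model (1), the objective of the primal problem."""
    e = y - X @ beta  # residuals implied by the model constraints
    m = len(y)
    return -0.5 * m * np.log(2.0 * np.pi * sigma2) - (e @ e) / (2.0 * sigma2)
```

<p>Maximizing this function over the parameters and errors, subject to the sample observations treated as constraints, is precisely the primal problem stated next.</p><p>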
Then, ML estimates of the parameters and errors can be obtained by maximizing the logarithm of the corresponding likelihood function with respect to <inline-formula><inline-graphic xlink:href="http://html.scirp.org/file/16-1240654x17.png" xlink:type="simple"/></inline-formula> subject to the constraints represented by model (1). We call this specification the primal ML method. Specifically,</p><disp-formula id="scirp.63884-formula808"><label>(2)</label><graphic position="anchor" xlink:href="http://html.scirp.org/file/16-1240654x18.png" xlink:type="simple"/></disp-formula><disp-formula id="scirp.63884-formula809"><label>(3)</label><graphic position="anchor" xlink:href="http://html.scirp.org/file/16-1240654x19.png" xlink:type="simple"/></disp-formula><p>Traditionally, constraints (3) are substituted for the error vector <inline-formula><inline-graphic xlink:href="http://html.scirp.org/file/16-1240654x20.png" xlink:type="simple"/></inline-formula> in the likelihood function (2) and an unconstrained maximization follows. This algebraic manipulation, however, obscures the path toward a dual specification of the problem under study. 
Therefore, we maintain the structure of the ML model as in relations (2) and (3) and proceed to state the corresponding Lagrange function by selecting an <inline-formula><inline-graphic xlink:href="http://html.scirp.org/file/16-1240654x21.png" xlink:type="simple"/></inline-formula> vector <inline-formula><inline-graphic xlink:href="http://html.scirp.org/file/16-1240654x22.png" xlink:type="simple"/></inline-formula> of Lagrange multipliers for constraints (3) to obtain</p><disp-formula id="scirp.63884-formula810"><label>(4)</label><graphic position="anchor" xlink:href="http://html.scirp.org/file/16-1240654x23.png" xlink:type="simple"/></disp-formula><p>The maximization of the Lagrange function (4) requires taking partial derivatives of <inline-formula><inline-graphic xlink:href="http://html.scirp.org/file/16-1240654x24.png" xlink:type="simple"/></inline-formula> with respect to <inline-formula><inline-graphic xlink:href="http://html.scirp.org/file/16-1240654x25.png" xlink:type="simple"/></inline-formula> and setting them equal to zero; a solution of the resulting equations is an estimate of the corresponding parameters and errors:</p><disp-formula id="scirp.63884-formula811"><label>(5)</label><graphic position="anchor" xlink:href="http://html.scirp.org/file/16-1240654x26.png" xlink:type="simple"/></disp-formula><disp-formula id="scirp.63884-formula812"><label>(6)</label><graphic position="anchor"
xlink:href="http://html.scirp.org/file/16-1240654x27.png"  xlink:type="simple"/></disp-formula><disp-formula id="scirp.63884-formula813"><label>(7)</label><graphic position="anchor" xlink:href="http://html.scirp.org/file/16-1240654x28.png"  xlink:type="simple"/></disp-formula><disp-formula id="scirp.63884-formula814"><label>(8)</label><graphic position="anchor" xlink:href="http://html.scirp.org/file/16-1240654x29.png"  xlink:type="simple"/></disp-formula><p>First order necessary conditions (FONC) are given by equating (5)-(8) to zero and signifying by “^” that the resulting solution values are ML estimates. We obtain</p><disp-formula id="scirp.63884-formula815"><label>(9)</label><graphic position="anchor" xlink:href="http://html.scirp.org/file/16-1240654x30.png"  xlink:type="simple"/></disp-formula><disp-formula id="scirp.63884-formula816"><label>(10)</label><graphic position="anchor" xlink:href="http://html.scirp.org/file/16-1240654x31.png"  xlink:type="simple"/></disp-formula><disp-formula id="scirp.63884-formula817"><label>(11)</label><graphic position="anchor" xlink:href="http://html.scirp.org/file/16-1240654x32.png"  xlink:type="simple"/></disp-formula><disp-formula id="scirp.63884-formula818"><label>(12)</label><graphic position="anchor" xlink:href="http://html.scirp.org/file/16-1240654x33.png"  xlink:type="simple"/></disp-formula><p>where<inline-formula><inline-graphic xlink:href="http://html.scirp.org/file/16-1240654x34.png" xlink:type="simple"/></inline-formula>. 
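</p><p>The first-order conditions (9)-(12) can be verified numerically. The sketch below (with illustrative data and variable names) computes the closed-form ML estimates of the linear model and forms the Lagrange multipliers as residuals scaled by the estimated variance, as Equation (9) states.</p>

```python
import numpy as np

rng = np.random.default_rng(1)
n = 150
X = np.column_stack([np.ones(n), rng.normal(size=n)])
y = X @ np.array([0.5, 1.5]) + rng.normal(scale=0.3, size=n)

# Closed-form ML estimates of the linear model y = X beta + e
beta_hat = np.linalg.solve(X.T @ X, X.T @ y)
e_hat = y - X @ beta_hat            # estimated residuals (noise)
sigma2_hat = (e_hat @ e_hat) / n    # ML variance estimate

# Equation (9): the Lagrange multipliers equal the residuals
# scaled by the variance of the error terms
lam_hat = e_hat / sigma2_hat
```

<p>Both the scaling relation and the orthogonality of these multipliers with the columns of the matrix of explanatory variables hold to machine precision.</p><p>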
Lagrange multipliers <inline-formula><inline-graphic xlink:href="http://html.scirp.org/file/16-1240654x35.png" xlink:type="simple"/></inline-formula> are defined in terms of the estimates of the primal variables <inline-formula><inline-graphic xlink:href="http://html.scirp.org/file/16-1240654x36.png" xlink:type="simple"/></inline-formula> and <inline-formula><inline-graphic xlink:href="http://html.scirp.org/file/16-1240654x37.png" xlink:type="simple"/></inline-formula>. 
Indeed, the vector of Lagrange multipliers <inline-formula><inline-graphic xlink:href="http://html.scirp.org/file/16-1240654x38.png" xlink:type="simple"/></inline-formula> is equal to <inline-formula><inline-graphic xlink:href="http://html.scirp.org/file/16-1240654x39.png" xlink:type="simple"/></inline-formula> up to a scalar <inline-formula><inline-graphic xlink:href="http://html.scirp.org/file/16-1240654x40.png" xlink:type="simple"/></inline-formula>. This equality simplifies the statement of the corresponding dual problem because we can dispense with the symbol <inline-formula><inline-graphic xlink:href="http://html.scirp.org/file/16-1240654x41.png" xlink:type="simple"/></inline-formula>. 
In fact, using Equation (9), Equation (10) can be restated equivalently as the following orthogonality condition between the residuals of model (1) and the space of predetermined variables</p><disp-formula id="scirp.63884-formula819"><label>(13)</label><graphic position="anchor" xlink:href="http://html.scirp.org/file/16-1240654x42.png" xlink:type="simple"/></disp-formula><p>Relation (9) is an example of self-duality, where a vector of dual variables is equal (up to a scalar) to a vector of primal variables.</p><p>The value of a Lagrange multiplier is a measure of the tightness of the corresponding constraint. In other words, the optimal value of the likelihood function would change by an amount equal to the value of the Lagrange multiplier per unit increase (or decrease) in the level of the corresponding sample observation. The meaning of a Lagrange multiplier (or dual variable) is that of a penalty imposed on a violation of the corresponding constraint. Hence, in economic terminology, a synonymous meaning of the Lagrange multiplier is that of a “marginal sacrifice” (or “shadow price”) associated with the presence of a tight constraint (observation) that prevents the objective function from achieving a higher (or lower) level. 
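</p><p>This marginal-sacrifice reading can be illustrated by perturbing a single observation and measuring the change in the optimal likelihood value. In the sketch below (illustrative data and names), the change per unit perturbation matches the corresponding multiplier in absolute value; its sign depends on the convention used to write constraint (3).</p>

```python
import numpy as np

rng = np.random.default_rng(2)
n = 100
X = np.column_stack([np.ones(n), rng.normal(size=n)])
y = X @ np.array([1.0, -2.0]) + rng.normal(scale=0.5, size=n)

def max_loglik(y, X):
    """Optimal value of the primal ML problem (2)-(3) for the given data."""
    m = len(y)
    beta = np.linalg.solve(X.T @ X, X.T @ y)
    e = y - X @ beta
    s2 = (e @ e) / m
    return -0.5 * m * np.log(2.0 * np.pi * s2) - (e @ e) / (2.0 * s2)

beta_hat = np.linalg.solve(X.T @ X, X.T @ y)
e_hat = y - X @ beta_hat
s2_hat = (e_hat @ e_hat) / n
lam_hat = e_hat / s2_hat  # shadow price attached to each observation

# Central finite difference of the optimal likelihood w.r.t. observation 0;
# its magnitude approximates the magnitude of the multiplier lam_hat[0]
delta = 1e-6
y_up, y_dn = y.copy(), y.copy()
y_up[0] += delta
y_dn[0] -= delta
marginal = (max_loglik(y_up, X) - max_loglik(y_dn, X)) / (2.0 * delta)
```

<p>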
Within the context of this paper, the vector of optimal values of the Lagrange multipliers <inline-formula><inline-graphic xlink:href="http://html.scirp.org/file/16-1240654x43.png" xlink:type="simple"/></inline-formula> is identically equal to the vector of residuals <inline-formula><inline-graphic xlink:href="http://html.scirp.org/file/16-1240654x44.png" xlink:type="simple"/></inline-formula> scaled by <inline-formula><inline-graphic xlink:href="http://html.scirp.org/file/16-1240654x45.png" xlink:type="simple"/></inline-formula>. Therefore, the symbol <inline-formula><inline-graphic xlink:href="http://html.scirp.org/file/16-1240654x46.png" xlink:type="simple"/></inline-formula> takes on a double role: as a vector of residuals (or noise) in the primal ML problem [(2)-(3)] and―when scaled by <inline-formula><inline-graphic xlink:href="http://html.scirp.org/file/16-1240654x47.png" xlink:type="simple"/></inline-formula>―as a vector of “shadow prices” in the dual ML problem (defined in the next section).</p><p>Furthermore, note that the multiplication of Equation (12) by the vector <inline-formula><inline-graphic xlink:href="http://html.scirp.org/file/16-1240654x48.png" xlink:type="simple"/></inline-formula> (replaced by its equivalent representation given in Equation (9)) and the use of the orthogonality condition (13) produces</p><disp-formula id="scirp.63884-formula820"><label>(14a)</label><graphic position="anchor" xlink:href="http://html.scirp.org/file/16-1240654x49.png" xlink:type="simple"/></disp-formula><disp-formula id="scirp.63884-formula821"><label>(14b)</label><graphic position="anchor" xlink:href="http://html.scirp.org/file/16-1240654x50.png" xlink:type="simple"/></disp-formula><disp-formula id="scirp.63884-formula822"><label>(14c)</label><graphic position="anchor" xlink:href="http://html.scirp.org/file/16-1240654x51.png" xlink:type="simple"/></disp-formula><p>This means that an ML estimate of <inline-formula><inline-graphic xlink:href="http://html.scirp.org/file/16-1240654x52.png" xlink:type="simple"/></inline-formula> can be obtained in two different but equivalent ways as</p><disp-formula id="scirp.63884-formula823"><label>(15)</label><graphic position="anchor" xlink:href="http://html.scirp.org/file/16-1240654x53.png" xlink:type="simple"/></disp-formula><p>Relation (15), as explained in Sections 3 and 4, is of paramount importance for stating an operational specification of the dual ML model because the presence of the vector of sample observations <inline-formula><inline-graphic xlink:href="http://html.scirp.org/file/16-1240654x54.png" xlink:type="simple"/></inline-formula> in the estimate of <inline-formula><inline-graphic
xlink:href="http://html.scirp.org/file/16-1240654x55.png" xlink:type="simple"/></inline-formula> is the only place where this sample information appears in the dual ML model. This assertion will be clarified further in Sections 3 and 4.</p></sec><sec id="s3"><title>3. The Dual of the Maximum Likelihood Method</title><p>The statement of the dual specification of the ML approach follows the general rules of duality theory in mathematical programming. The dual objective function, to be minimized with respect to the Lagrange multipliers, is the Lagrange function stated in (4) with the appropriate simplifications allowed by its algebraic structure and the information derived from FONCs (9) and (10). The dual constraints are all the FONCs that are different from the primal constraints. This leads to the following specification:</p><disp-formula id="scirp.63884-formula824"><label>(16)</label><graphic position="anchor" xlink:href="http://html.scirp.org/file/16-1240654x56.png" xlink:type="simple"/></disp-formula><disp-formula id="scirp.63884-formula825"><label>(17)</label><graphic position="anchor" xlink:href="http://html.scirp.org/file/16-1240654x57.png" xlink:type="simple"/></disp-formula><disp-formula id="scirp.63884-formula826"><label>(18)</label><graphic position="anchor" xlink:href="http://html.scirp.org/file/16-1240654x58.png" xlink:type="simple"/></disp-formula><p>The second expression of the dual ML function (16) follows from the fact that <inline-formula><inline-graphic xlink:href="http://html.scirp.org/file/16-1240654x59.png" xlink:type="simple"/></inline-formula> (from (15)). 
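</p><p>Because Equations (16)-(18) are rendered here as images, the following sketch solves a reduced form of the dual that is consistent with the surrounding text, treating sigma^2 as given: minimize a quadratic cost of noise minus the gross value of the sample information, subject to the orthogonality constraints (17). The quadratic form of the noise-cost term is an assumption suggested by the discussion of Equation (19), not a transcription of the paper's formula; the point illustrated is that the estimates of the parameter vector emerge as Lagrange multipliers of the orthogonality constraints, as stated below.</p>

```python
import numpy as np

rng = np.random.default_rng(3)
n, k = 80, 2
X = np.column_stack([np.ones(n), rng.normal(size=n)])
y = X @ np.array([2.0, 0.7]) + rng.normal(scale=0.4, size=n)

# Primal (ML) estimates, for comparison
beta_ml = np.linalg.solve(X.T @ X, X.T @ y)
e_ml = y - X @ beta_ml
s2 = (e_ml @ e_ml) / n  # sigma^2 treated as given in this sketch

# Dual sketch: min over lam of (s2/2) lam'lam - lam'y  s.t.  X'lam = 0.
# Stationarity of its Lagrangian (with multipliers beta on the
# orthogonality constraints) gives the linear KKT system below.
K = np.block([[s2 * np.eye(n), X],
              [X.T, np.zeros((k, k))]])
rhs = np.concatenate([y, np.zeros(k)])
sol = np.linalg.solve(K, rhs)
lam_dual, beta_dual = sol[:n], sol[n:]
```

<p>Solving the KKT system reproduces the primal ML estimates exactly, with lam_dual equal to the residuals scaled by the variance, as in Equation (9).</p><p>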
The solution of the dual ML problem [(16)-(18)] by any reliable nonlinear programming software (e.g., GAMS [<xref ref-type="bibr" rid="scirp.63884-ref3">3</xref>]) produces estimates of <inline-formula><inline-graphic xlink:href="http://html.scirp.org/file/16-1240654x60.png" xlink:type="simple"/></inline-formula> that are identical to those obtained from the primal ML problem [(2)-(3)]. Observe that the vector of sample information <inline-formula><inline-graphic xlink:href="http://html.scirp.org/file/16-1240654x61.png" xlink:type="simple"/></inline-formula> appears only in Equation (18). This fact justifies the computation of the model variance <inline-formula><inline-graphic xlink:href="http://html.scirp.org/file/16-1240654x62.png" xlink:type="simple"/></inline-formula> by means of the rarely used formula (18). 
The dual ML estimate of the parameter vector <inline-formula><inline-graphic xlink:href="http://html.scirp.org/file/16-1240654x63.png" xlink:type="simple"/></inline-formula> is obtained as the vector of Lagrange multipliers associated with the orthogonality constraints (17). The terms in the square bracket of the dual objective function (16) express a quantity that has intuitive appeal, that is</p><disp-formula id="scirp.63884-formula827"><label>(19)</label><graphic position="anchor" xlink:href="http://html.scirp.org/file/16-1240654x64.png" xlink:type="simple"/></disp-formula><p>Using the terminology of information theory, the expression on the right-hand side of Equation (19) can be given the following interpretation. 
The linear statistical model (1) can be regarded as the decomposition of a message into a signal and noise, that is</p><p>message = signal + noise (20)</p><p>where message <inline-formula><inline-graphic xlink:href="http://html.scirp.org/file/16-1240654x65.png" xlink:type="simple"/></inline-formula>, signal <inline-formula><inline-graphic xlink:href="http://html.scirp.org/file/16-1240654x66.png" xlink:type="simple"/></inline-formula>, and noise <inline-formula><inline-graphic xlink:href="http://html.scirp.org/file/16-1240654x67.png" xlink:type="simple"/></inline-formula>. Let us recall that <inline-formula><inline-graphic xlink:href="http://html.scirp.org/file/16-1240654x68.png" xlink:type="simple"/></inline-formula> is the “shadow price” of the constraints of model (1) (defined by (2)) and <inline-formula><inline-graphic xlink:href="http://html.scirp.org/file/16-1240654x69.png" xlink:type="simple"/></inline-formula> is the quantity of sample information. Hence, the inner product <inline-formula><inline-graphic xlink:href="http://html.scirp.org/file/16-1240654x70.png" xlink:type="simple"/></inline-formula> can be regarded as the gross value (price times quantity) of the sample information, while the expression <inline-formula><inline-graphic xlink:href="http://html.scirp.org/file/16-1240654x71.png" xlink:type="simple"/></inline-formula> (again, shadow price times quantity) can be regarded as a cost function of noise. In summary, therefore, the relation <inline-formula><inline-graphic xlink:href="http://html.scirp.org/file/16-1240654x72.png" xlink:type="simple"/></inline-formula> can be interpreted as the net value of sample information (NVSI), which is maximized in the dual objective function because of the negative sign in front of it.</p><p>A simplified (for graphical reasons) presentation of the primal and dual objective functions (2) and (16) of the ML problem (subject to constraints) may provide some intuition about the shape of those functions. <xref ref-type="fig" rid="fig1">Figure 1</xref> is the antilog of the primal likelihood function (2) (subject to constraint (3)) of a sample of two random variables <inline-formula><inline-graphic xlink:href="http://html.scirp.org/file/16-1240654x73.png" xlink:type="simple"/></inline-formula> with mean 0 and variance 0.16. 
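For the least-squares form of the problem (where, as noted below, LS and ML estimates coincide), the price-times-quantity reading of the NVSI can be checked numerically. The following sketch (NumPy on synthetic data, not the paper's GAMS code) verifies that the maximizer of the dual objective, subject to the orthogonality constraints, is exactly the primal residual vector, and that the net value of sample information equals the primal objective at the optimum.

```python
import numpy as np

# Synthetic data (illustration only; the X, y, and coefficients are made up).
rng = np.random.default_rng(0)
n, k = 20, 3
X = rng.normal(size=(n, k))
y = X @ np.array([1.0, -2.0, 0.5]) + rng.normal(scale=0.4, size=n)

# Primal: min (1/2) e'e  subject to  y = X b + e  -> ordinary least squares.
b, *_ = np.linalg.lstsq(X, y, rcond=None)
e = y - X @ b

# Dual: max  lam'y - (1/2) lam'lam  subject to  X'lam = 0.
# The maximizer is the projection of y off the column space of X,
# i.e. the primal residual vector: "shadow prices" = residuals.
P = np.eye(n) - X @ np.linalg.solve(X.T @ X, X.T)
lam = P @ y

assert np.allclose(lam, e)            # Lagrange multipliers equal residuals
assert np.allclose(X.T @ lam, 0)      # orthogonality constraints hold
nvsi = lam @ y - 0.5 * (lam @ lam)    # gross value minus cost of noise
assert np.isclose(nvsi, 0.5 * (e @ e))  # strong duality: NVSI = primal value
```

The last assertion is the numerical counterpart of the statement above: the gross value of sample information minus the cost of noise, evaluated at the optimal shadow prices, reproduces the primal least-squares objective.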
The sample information is <inline-formula><inline-graphic xlink:href="http://html.scirp.org/file/16-1240654x74.png" xlink:type="simple"/></inline-formula>. For graphical reasons, the variance of the error terms is kept fixed at <inline-formula><inline-graphic xlink:href="http://html.scirp.org/file/16-1240654x75.png" xlink:type="simple"/></inline-formula>. The explanatory variables are taken as <inline-formula><inline-graphic xlink:href="http://html.scirp.org/file/16-1240654x76.png" xlink:type="simple"/></inline-formula>. 
There are two parameters to estimate: <inline-formula><inline-graphic xlink:href="http://html.scirp.org/file/16-1240654x77.png" xlink:type="simple"/></inline-formula> and <inline-formula><inline-graphic xlink:href="http://html.scirp.org/file/16-1240654x78.png" xlink:type="simple"/></inline-formula>. 
The ML estimates of these parameters are <inline-formula><inline-graphic xlink:href="http://html.scirp.org/file/16-1240654x79.png" xlink:type="simple"/></inline-formula> and <inline-formula><inline-graphic xlink:href="http://html.scirp.org/file/16-1240654x80.png" xlink:type="simple"/></inline-formula> with 
<inline-formula><inline-graphic xlink:href="http://html.scirp.org/file/16-1240654x81.png" xlink:type="simple"/></inline-formula> and <inline-formula><inline-graphic xlink:href="http://html.scirp.org/file/16-1240654x82.png" xlink:type="simple"/></inline-formula></p><p><xref ref-type="fig" rid="fig1">Figure 1</xref> presents the expected shape of the primal likelihood function (subject to constraints) that justifies the maximization objective.</p><p><xref ref-type="fig" rid="fig2">Figure 2</xref> is the antilog of the dual objective function (16) (subject to constraint (17)) for the same values of variables Y and X. <xref ref-type="fig" rid="fig2">Figure 2</xref> shows a convex function of the error terms with <inline-formula><inline-graphic xlink:href="http://html.scirp.org/file/16-1240654x83.png" xlink:type="simple"/></inline-formula> and <inline-formula><inline-graphic xlink:href="http://html.scirp.org/file/16-1240654x84.png" xlink:type="simple"/></inline-formula> being the minimizing values of the dual objective function.</p><p>It is well known that, under the assumptions of model (1), least-squares estimates of the parameters of model (1) are also maximum likelihood estimates. 
Hence, <inline-formula><inline-graphic xlink:href="http://html.scirp.org/file/16-1240654x85.png" xlink:type="simple"/></inline-formula> is the primal objective function (to be minimized) of the least-squares method, while <inline-formula><inline-graphic xlink:href="http://html.scirp.org/file/16-1240654x86.png" xlink:type="simple"/></inline-formula> corresponds to the dual objective function (to be maximized) of the least-squares approach, as in [<xref ref-type="bibr" rid="scirp.63884-ref4">4</xref>]. Optimal solutions of the primal and dual least-squares problems result in</p><disp-formula id="scirp.63884-formula828"><label>(21)</label><graphic position="anchor" xlink:href="http://html.scirp.org/file/16-1240654x87.png"  xlink:type="simple"/></disp-formula><p>where LS stands for Least Squares. The role as a vector of “shadow prices” taken on by <inline-formula><inline-graphic xlink:href="http://html.scirp.org/file/16-1240654x88.png" xlink:type="simple"/></inline-formula> is also justified by the derivative of the Least-Squares function with respect to the quantity vector of sample information y. 
In this case,</p><disp-formula id="scirp.63884-formula829"><graphic  xlink:href="http://html.scirp.org/file/16-1240654x89.png"  xlink:type="simple"/></disp-formula><fig-group id="fig1"><label><xref ref-type="fig" rid="fig1">Figure 1</xref></label><caption><title> Representation of the antilog of the primal objective function (2).</title></caption><fig id ="fig1_1"><label></label><graphic mimetype="image"   position="float"  xlink:type="simple"  xlink:href="http://html.scirp.org/file/16-1240654x90.png"/></fig></fig-group><fig id="fig2"  position="float"><label><xref ref-type="fig" rid="fig2">Figure 2</xref></label><caption><title> Representation of the antilog of the dual objective function (16).</title></caption><graphic mimetype="image"   position="float"  xlink:type="simple"  xlink:href="http://html.scirp.org/file/16-1240654x91.png"/></fig><p>shows that the change in the Least-Squares objective function due to an infinitesimal change in the level of the quantity vector of constraints <inline-formula><inline-graphic xlink:href="http://html.scirp.org/file/16-1240654x92.png" xlink:type="simple"/></inline-formula> (sample information in model (1)) corresponds to the Lagrange multiplier vector <inline-formula><inline-graphic xlink:href="http://html.scirp.org/file/16-1240654x93.png" xlink:type="simple"/></inline-formula> that, in the LS case, is equal to the vector of residuals <inline-formula><inline-graphic xlink:href="http://html.scirp.org/file/16-1240654x94.png" xlink:type="simple"/></inline-formula>.</p></sec><sec id="s4"><title>4. 
A Numerical Example: Parameter Estimates by Primal and Dual ML</title><p>In this section we consider the estimation of a production function using the results of a famous experimental trial on corn conducted in Iowa by Heady, Pesek and Brown [<xref ref-type="bibr" rid="scirp.63884-ref5">5</xref>]. There are n = 114 observations on phosphorus and nitrogen levels allocated to corn experimental plots according to an incomplete factorial design. The sample data are given in <xref ref-type="table" rid="table">Table </xref>A1 in the Appendix. The production function of interest takes the form of a second-order polynomial relation involving phosphorus and nitrogen, such as</p><disp-formula id="scirp.63884-formula830"><label>(22)</label><graphic position="anchor" xlink:href="http://html.scirp.org/file/16-1240654x95.png"  xlink:type="simple"/></disp-formula><p>where <inline-formula><inline-graphic xlink:href="http://html.scirp.org/file/16-1240654x96.png" xlink:type="simple"/></inline-formula> corn, <inline-formula><inline-graphic xlink:href="http://html.scirp.org/file/16-1240654x97.png" xlink:type="simple"/></inline-formula> phosphorus and <inline-formula><inline-graphic xlink:href="http://html.scirp.org/file/16-1240654x98.png" xlink:type="simple"/></inline-formula> nitrogen.</p><p>This polynomial regression was estimated twice: first using the primal ML specification given in (2) and (3), and then using the dual ML version given in (16), (17) and (18). 
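As a rough illustration of the primal side of this estimation, the quadratic response (22) can be fitted by ordinary least squares (equivalent to ML under the normality assumption of model (1)). The sketch below uses NumPy rather than the paper's GAMS code, and only a handful of the Heady-Pesek-Brown observations reproduced in Table A1, so its coefficients will not reproduce Table 1, which uses all n = 114 plots.

```python
import numpy as np

# A few (corn, P, N) observations copied from Table A1 in the Appendix.
data = np.array([
    [0.245, 0.0, 0.0],
    [0.267, 0.4, 0.0],
    [0.221, 0.8, 0.0],
    [0.251, 0.0, 1.2],
    [1.194, 1.2, 1.2],
    [1.333, 1.6, 1.2],
    [0.960, 0.4, 1.6],
    [0.162, 0.0, 2.4],
    [1.124, 0.8, 2.4],
    [1.305, 1.6, 2.4],
])
y, P, N = data[:, 0], data[:, 1], data[:, 2]

# Second-order polynomial design: intercept, P, N, P^2, N^2, P*N.
X = np.column_stack([np.ones_like(P), P, N, P**2, N**2, P * N])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
e = y - X @ beta

assert np.allclose(X.T @ e, 0, atol=1e-8)  # residuals orthogonal to X
sigma2 = e @ e / len(y)                    # ML (not bias-corrected) variance
assert sigma2 >= 0
```

The orthogonality check mirrors constraints (17): at the optimum the residual vector, which doubles as the vector of Lagrange multipliers in the dual specification, is orthogonal to every column of the design matrix.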
The results are identical and are reported (only once) in <xref ref-type="table" rid="table">Table </xref>1.</p><p>Specifically, the estimated dual ML model takes on the following structure:</p><table-wrap id="table1" ><label><xref ref-type="table" rid="table">Table </xref>1</label><caption><title> ML estimates using the dual specification.</title></caption><table><thead><tr><th align="center" valign="middle" >Parameters</th><th align="center" valign="middle" >Estimates</th></tr></thead><tbody><tr><td align="center" valign="middle" ><inline-formula><inline-graphic xlink:href="http://html.scirp.org/file/16-1240654x99.png" xlink:type="simple"/></inline-formula></td><td align="center" valign="middle" >−0.0751090</td></tr><tr><td align="center" valign="middle" ><inline-formula><inline-graphic xlink:href="http://html.scirp.org/file/16-1240654x100.png" xlink:type="simple"/></inline-formula></td><td align="center" valign="middle" >0.6638066</td></tr><tr><td align="center" valign="middle" ><inline-formula><inline-graphic xlink:href="http://html.scirp.org/file/16-1240654x101.png" xlink:type="simple"/></inline-formula></td><td align="center" valign="middle" >0.5840521</td></tr><tr><td align="center" valign="middle" ><inline-formula><inline-graphic xlink:href="http://html.scirp.org/file/16-1240654x102.png" xlink:type="simple"/></inline-formula></td><td align="center" valign="middle" >−0.1795839</td></tr><tr><td align="center" valign="middle" ><inline-formula><inline-graphic xlink:href="http://html.scirp.org/file/16-1240654x103.png" xlink:type="simple"/></inline-formula></td><td align="center" valign="middle" >−0.1579967</td></tr><tr><td align="center" valign="middle" ><inline-formula><inline-graphic xlink:href="http://html.scirp.org/file/16-1240654x104.png" xlink:type="simple"/></inline-formula></td><td align="center" valign="middle" >0.0809632</td></tr><tr><td align="center" valign="middle" ><inline-formula><inline-graphic xlink:href="http://html.scirp.org/file/16-1240654x105.png" xlink:type="simple"/></inline-formula></td><td align="center" valign="middle" >0.0357006</td></tr><tr><td align="center" valign="middle" >Log Likelihood</td><td align="center" valign="middle" >28.1985686</td></tr></tbody></table></table-wrap><disp-formula id="scirp.63884-formula831"><label>(23)</label><graphic position="anchor" xlink:href="http://html.scirp.org/file/16-1240654x106.png"  xlink:type="simple"/></disp-formula><disp-formula id="scirp.63884-formula832"><label>(24)</label><graphic position="anchor" xlink:href="http://html.scirp.org/file/16-1240654x107.png"  xlink:type="simple"/></disp-formula><disp-formula id="scirp.63884-formula833"><label>(25)</label><graphic position="anchor" xlink:href="http://html.scirp.org/file/16-1240654x108.png"  xlink:type="simple"/></disp-formula><p>The estimation was performed using GAMS [<xref ref-type="bibr" rid="scirp.63884-ref3">3</xref>]. Observe that the sample information <inline-formula><inline-graphic xlink:href="http://html.scirp.org/file/16-1240654x109.png" xlink:type="simple"/></inline-formula> appears only in Equation (25), which estimates the model variance <inline-formula><inline-graphic xlink:href="http://html.scirp.org/file/16-1240654x110.png" xlink:type="simple"/></inline-formula>. The estimates of parameters <inline-formula><inline-graphic xlink:href="http://html.scirp.org/file/16-1240654x111.png" xlink:type="simple"/></inline-formula> will appear in the Lagrange function as Lagrange multipliers of constraints (24).</p></sec><sec id="s5"><title>5. 
Conclusions</title><p>A statistical model may be regarded as the decomposition of a sample of messages into signals and noise. When the noise is distributed according to a normal density, the ML method maximizes the probability that the sample belongs to a given normal population defined by the ML estimates of the model’s parameters. All this is well known. This paper has analyzed and interpreted the dual of the ML method under normal assumptions.</p><p>It turns out that the dual objective function is a convex function of noise. Hence, a convenient interpretation is that the dual of the ML method minimizes a cost function of noise. This cost function is defined by the sample variance of noise. Equivalently, the dual ML method maximizes the net value of the sample information. The choice of an economic terminology for interpreting the dual ML method is justified by the double role played by the symbol <inline-formula><inline-graphic xlink:href="http://html.scirp.org/file/16-1240654x112.png" xlink:type="simple"/></inline-formula> representing the vector of residual terms of the estimated statistical model. This symbol may be interpreted as a vector of noises in the primal specification and a vector of “shadow prices” in the dual model because it corresponds identically (up to a scalar) to the vector of Lagrange multipliers of the sample observations.</p><p>The numerical implementation of the dual ML method brings to the fore, by necessity, a neglected definition of the sample variance. In the ML estimation of the model’s parameters, it is necessary to state the definition of the variance as the inner product of the sample information and the residuals (noise), as revealed by Equations (15) and (25), because it is the only place where the sample information appears in the dual model. The dual approach to ML provides an alternative path to the desired ML estimates and, therefore, augments the statistician’s toolkit. 
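The "neglected definition" of the sample variance invoked here follows directly from the orthogonality conditions: since X'e = 0 at the optimum, e'e = (y − Xb)'e = y'e, so the variance can be written as the inner product of the sample information and the residuals. A minimal numeric check of this identity (NumPy, synthetic data; not the paper's GAMS code):

```python
import numpy as np

# Synthetic linear model (illustration only; X, y are made up).
rng = np.random.default_rng(1)
n, k = 30, 4
X = rng.normal(size=(n, k))
y = X @ rng.normal(size=k) + rng.normal(scale=0.3, size=n)

# LS = ML estimates under normality; e is the residual/noise vector.
b, *_ = np.linalg.lstsq(X, y, rcond=None)
e = y - X @ b

var_usual = e @ e / n   # usual ML variance estimate: mean squared noise
var_dual = y @ e / n    # "dual" form: sample information times noise
assert np.isclose(var_usual, var_dual)
```

It is this second form that must be used in the dual ML program, because (as Equations (15) and (25) show) it is the only place where the sample information y enters the dual model.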
It integrates our knowledge of what exists on the other side of the ML fence, a territory that has revealed interesting vistas on old statistical landscapes.</p></sec><sec id="s6"><title>Cite this paper</title><p>Quirino Paris (2016) The Dual of the Maximum Likelihood. Open Journal of Statistics, 06, 186-193. doi: 10.4236/ojs.2016.61016</p></sec><sec id="s7"><title>Appendix</title><table-wrap id="table2" ><label><xref ref-type="table" rid="table">Table </xref>A1</label><caption><title> Sample data from the Iowa corn experiment [<xref ref-type="bibr" rid="scirp.63884-ref5">5</xref>].</title></caption><table><thead><tr><th align="center" valign="middle" >Observ.</th><th align="center" valign="middle" >Corn</th><th align="center" valign="middle" >P</th><th align="center" valign="middle" >N</th><th align="center" valign="middle" >Observ.</th><th align="center" valign="middle" >Corn</th><th align="center" valign="middle" >P</th><th align="center" valign="middle" >N</th><th align="center" valign="middle" >Observ.</th><th align="center" valign="middle" >Corn</th><th align="center" valign="middle" >P</th><th align="center" valign="middle" >N</th></tr></thead><tbody><tr><td align="center" valign="middle" >1</td><td align="center" valign="middle" >0.245</td><td align="center" valign="middle" >0.0</td><td align="center" valign="middle" >0.0</td><td align="center" valign="middle" >39</td><td align="center" valign="middle" >0.251</td><td align="center" valign="middle" >0.0</td><td align="center" valign="middle" >1.2</td><td align="center" valign="middle" >77</td><td align="center" valign="middle" >0.162</td><td align="center" valign="middle" >0.0</td><td align="center" valign="middle" >2.4</td></tr><tr><td align="center" valign="middle" >2</td><td align="center" valign="middle" >0.062</td><td align="center" valign="middle" >0.0</td><td align="center" valign="middle" >0.0</td><td align="center" valign="middle" >40</td><td align="center" valign="middle" >0.245</td><td 
align="center" valign="middle" >0.0</td><td align="center" valign="middle" >1.2</td><td align="center" valign="middle" >78</td><td align="center" valign="middle" >0.068</td><td align="center" valign="middle" >0.0</td><td align="center" valign="middle" >2.4</td></tr><tr><td align="center" valign="middle" >3</td><td align="center" valign="middle" >0.267</td><td align="center" valign="middle" >0.4</td><td align="center" valign="middle" >0.0</td><td align="center" valign="middle" >41</td><td align="center" valign="middle" >1.194</td><td align="center" valign="middle" >1.2</td><td align="center" valign="middle" >1.2</td><td align="center" valign="middle" >79</td><td align="center" valign="middle" >1.124</td><td align="center" valign="middle" >0.8</td><td align="center" valign="middle" >2.4</td></tr><tr><td align="center" valign="middle" >4</td><td align="center" valign="middle" >0.296</td><td align="center" valign="middle" >0.4</td><td align="center" valign="middle" >0.0</td><td align="center" valign="middle" >42</td><td align="center" valign="middle" >0.973</td><td align="center" valign="middle" >1.2</td><td align="center" valign="middle" >1.2</td><td align="center" valign="middle" >80</td><td align="center" valign="middle" >1.256</td><td align="center" valign="middle" >0.8</td><td align="center" valign="middle" >2.4</td></tr><tr><td align="center" valign="middle" >5</td><td align="center" valign="middle" >0.221</td><td align="center" valign="middle" >0.8</td><td align="center" valign="middle" >0.0</td><td align="center" valign="middle" >43</td><td align="center" valign="middle" >1.333</td><td align="center" valign="middle" >1.6</td><td align="center" valign="middle" >1.2</td><td align="center" valign="middle" >81</td><td align="center" valign="middle" >1.305</td><td align="center" valign="middle" >1.6</td><td align="center" valign="middle" >2.4</td></tr><tr><td align="center" valign="middle" >6</td><td align="center" valign="middle" >0.306</td><td align="center" 
valign="middle" >0.8</td><td align="center" valign="middle" >0.0</td><td align="center" valign="middle" >44</td><td align="center" valign="middle" >1.244</td><td align="center" valign="middle" >1.6</td><td align="center" valign="middle" >1.2</td><td align="center" valign="middle" >82</td><td align="center" valign="middle" >1.243</td><td align="center" valign="middle" >1.6</td><td align="center" valign="middle" >2.4</td></tr><tr><td align="center" valign="middle" >7</td><td align="center" valign="middle" >0.442</td><td align="center" valign="middle" >1.2</td><td align="center" valign="middle" >0.0</td><td align="center" valign="middle" >45</td><td align="center" valign="middle" >1.295</td><td align="center" valign="middle" >2.8</td><td align="center" valign="middle" >1.2</td><td align="center" valign="middle" >83</td><td align="center" valign="middle" >1.211</td><td align="center" valign="middle" >2.4</td><td align="center" valign="middle" >2.4</td></tr><tr><td align="center" valign="middle" >8</td><td align="center" valign="middle" >0.219</td><td align="center" valign="middle" >1.2</td><td align="center" valign="middle" >0.0</td><td align="center" valign="middle" >46</td><td align="center" valign="middle" >1.252</td><td align="center" valign="middle" >2.8</td><td align="center" valign="middle" >1.2</td><td align="center" valign="middle" >84</td><td align="center" valign="middle" >1.142</td><td align="center" valign="middle" >2.4</td><td align="center" valign="middle" >2.4</td></tr><tr><td align="center" valign="middle" >9</td><td align="center" valign="middle" >0.120</td><td align="center" valign="middle" >1.6</td><td align="center" valign="middle" >0.0</td><td align="center" valign="middle" >47</td><td align="center" valign="middle" >1.357</td><td align="center" valign="middle" >3.2</td><td align="center" valign="middle" >1.2</td><td align="center" valign="middle" >85</td><td align="center" valign="middle" >1.273</td><td align="center" valign="middle" >3.2</td><td 
align="center" valign="middle" >2.4</td></tr><tr><td align="center" valign="middle" >10</td><td align="center" valign="middle" >0.340</td><td align="center" valign="middle" >1.6</td><td align="center" valign="middle" >0.0</td><td align="center" valign="middle" >48</td><td align="center" valign="middle" >1.215</td><td align="center" valign="middle" >3.2</td><td align="center" valign="middle" >1.2</td><td align="center" valign="middle" >86</td><td align="center" valign="middle" >1.395</td><td align="center" valign="middle" >3.2</td><td align="center" valign="middle" >2.4</td></tr><tr><td align="center" valign="middle" >11</td><td align="center" valign="middle" >0.377</td><td align="center" valign="middle" >2.0</td><td align="center" valign="middle" >0.0</td><td align="center" valign="middle" >49</td><td align="center" valign="middle" >0.173</td><td align="center" valign="middle" >0.0</td><td align="center" valign="middle" >1.6</td><td align="center" valign="middle" >87</td><td align="center" valign="middle" >0.268</td><td align="center" valign="middle" >0.0</td><td align="center" valign="middle" >2.8</td></tr><tr><td align="center" valign="middle" >12</td><td align="center" valign="middle" >0.342</td><td align="center" valign="middle" >2.0</td><td align="center" valign="middle" >0.0</td><td align="center" valign="middle" >50</td><td align="center" valign="middle" >0.042</td><td align="center" valign="middle" >0.0</td><td align="center" valign="middle" >1.6</td><td align="center" valign="middle" >88</td><td align="center" valign="middle" >0.077</td><td align="center" valign="middle" >0.0</td><td align="center" valign="middle" >2.8</td></tr><tr><td align="center" valign="middle" >13</td><td align="center" valign="middle" >0.380</td><td align="center" valign="middle" >2.4</td><td align="center" valign="middle" >0.0</td><td align="center" valign="middle" >51</td><td align="center" valign="middle" >0.960</td><td align="center" valign="middle" >0.4</td><td align="center" 
valign="middle" >1.6</td><td align="center" valign="middle" >89</td><td align="center" valign="middle" >1.149</td><td align="center" valign="middle" >1.2</td><td align="center" valign="middle" >2.8</td></tr><tr><td align="center" valign="middle" >14</td><td align="center" valign="middle" >0.350</td><td align="center" valign="middle" >2.4</td><td align="center" valign="middle" >0.0</td><td align="center" valign="middle" >52</td><td align="center" valign="middle" >1.070</td><td align="center" valign="middle" >0.4</td><td align="center" valign="middle" >1.6</td><td align="center" valign="middle" >90</td><td align="center" valign="middle" >1.292</td><td align="center" valign="middle" >1.2</td><td align="center" valign="middle" >2.8</td></tr><tr><td align="center" valign="middle" >15</td><td align="center" valign="middle" >0.324</td><td align="center" valign="middle" >2.8</td><td align="center" valign="middle" >0.0</td><td align="center" valign="middle" >53</td><td align="center" valign="middle" >1.159</td><td align="center" valign="middle" >0.8</td><td align="center" valign="middle" >1.6</td><td align="center" valign="middle" >91</td><td align="center" valign="middle" >1.236</td><td align="center" valign="middle" >1.6</td><td align="center" valign="middle" >2.8</td></tr><tr><td align="center" valign="middle" >16</td><td align="center" valign="middle" >0.274</td><td align="center" valign="middle" >2.8</td><td align="center" valign="middle" >0.0</td><td align="center" valign="middle" >54</td><td align="center" valign="middle" >0.726</td><td align="center" valign="middle" >0.8</td><td align="center" valign="middle" >1.6</td><td align="center" valign="middle" >92</td><td align="center" valign="middle" >1.425</td><td align="center" valign="middle" >1.6</td><td align="center" valign="middle" >2.8</td></tr><tr><td align="center" valign="middle" >17</td><td align="center" valign="middle" >0.063</td><td align="center" valign="middle" >3.2</td><td align="center" valign="middle" 
>0.0</td><td align="center" valign="middle" >55</td><td align="center" valign="middle" >1.136</td><td align="center" valign="middle" >1.2</td><td align="center" valign="middle" >1.6</td><td align="center" valign="middle" >93</td><td align="center" valign="middle" >1.300</td><td align="center" valign="middle" >2.8</td><td align="center" valign="middle" >2.8</td></tr><tr><td align="center" valign="middle" >18</td><td align="center" valign="middle" >0.179</td><td align="center" valign="middle" >3.2</td><td align="center" valign="middle" >0.0</td><td align="center" valign="middle" >56</td><td align="center" valign="middle" >1.021</td><td align="center" valign="middle" >1.2</td><td align="center" valign="middle" >1.6</td><td align="center" valign="middle" >94</td><td align="center" valign="middle" >1.419</td><td align="center" valign="middle" >2.8</td><td align="center" valign="middle" >2.8</td></tr><tr><td align="center" valign="middle" >19</td><td align="center" valign="middle" >0.239</td><td align="center" valign="middle" >0.0</td><td align="center" valign="middle" >0.4</td><td align="center" valign="middle" >57</td><td align="center" valign="middle" >1.297</td><td align="center" valign="middle" >1.6</td><td align="center" valign="middle" >1.6</td><td align="center" valign="middle" >95</td><td align="center" valign="middle" >1.318</td><td align="center" valign="middle" >3.2</td><td align="center" valign="middle" >2.8</td></tr><tr><td align="center" valign="middle" >20</td><td align="center" valign="middle" >0.118</td><td align="center" valign="middle" >0.0</td><td align="center" valign="middle" >0.4</td><td align="center" valign="middle" >58</td><td align="center" valign="middle" >1.163</td><td align="center" valign="middle" >1.6</td><td align="center" valign="middle" >1.6</td><td align="center" valign="middle" >96</td><td align="center" valign="middle" >1.119</td><td align="center" valign="middle" >3.2</td><td align="center" valign="middle" >2.8</td></tr><tr><td 
align="center" valign="middle" >21</td><td align="center" valign="middle" >0.602</td><td align="center" valign="middle" >0.4</td><td align="center" valign="middle" >0.4</td><td align="center" valign="middle" >59</td><td align="center" valign="middle" >1.287</td><td align="center" valign="middle" >2.0</td><td align="center" valign="middle" >1.6</td><td align="center" valign="middle" >97</td><td align="center" valign="middle" >0.251</td><td align="center" valign="middle" >0.0</td><td align="center" valign="middle" >3.2</td></tr><tr><td align="center" valign="middle" >22</td><td align="center" valign="middle" >0.825</td><td align="center" valign="middle" >0.4</td><td align="center" valign="middle" >0.4</td><td align="center" valign="middle" >60</td><td align="center" valign="middle" >1.093</td><td align="center" valign="middle" >2.0</td><td align="center" valign="middle" >1.6</td><td align="center" valign="middle" >98</td><td align="center" valign="middle" >0.190</td><td align="center" valign="middle" >0.0</td><td align="center" valign="middle" >3.2</td></tr><tr><td align="center" valign="middle" >23</td><td align="center" valign="middle" >0.962</td><td align="center" valign="middle" >1.6</td><td align="center" valign="middle" >0.4</td><td align="center" valign="middle" >61</td><td align="center" valign="middle" >1.276</td><td align="center" valign="middle" >2.4</td><td align="center" valign="middle" >1.6</td><td align="center" valign="middle" >99</td><td align="center" valign="middle" >0.819</td><td align="center" valign="middle" >0.4</td><td align="center" valign="middle" >3.2</td></tr><tr><td align="center" valign="middle" >24</td><td align="center" valign="middle" >0.807</td><td align="center" valign="middle" >1.6</td><td align="center" valign="middle" >0.4</td><td align="center" valign="middle" >62</td><td align="center" valign="middle" >1.258</td><td align="center" valign="middle" >2.4</td><td align="center" valign="middle" >1.6</td><td align="center" 
valign="middle" >100</td><td align="center" valign="middle" >0.764</td><td align="center" valign="middle" >0.4</td><td align="center" valign="middle" >3.2</td></tr><tr><td align="center" valign="middle" >25</td><td align="center" valign="middle" >0.811</td><td align="center" valign="middle" >2.0</td><td align="center" valign="middle" >0.4</td><td align="center" valign="middle" >63</td><td align="center" valign="middle" >1.344</td><td align="center" valign="middle" >2.8</td><td align="center" valign="middle" >1.6</td><td align="center" valign="middle" >101</td><td align="center" valign="middle" >1.290</td><td align="center" valign="middle" >0.8</td><td align="center" valign="middle" >3.2</td></tr><tr><td align="center" valign="middle" >26</td><td align="center" valign="middle" >0.510</td><td align="center" valign="middle" >2.0</td><td align="center" valign="middle" >0.4</td><td align="center" valign="middle" >64</td><td align="center" valign="middle" >1.276</td><td align="center" valign="middle" >2.8</td><td align="center" valign="middle" >1.6</td><td align="center" valign="middle" >102</td><td align="center" valign="middle" >0.820</td><td align="center" valign="middle" >0.8</td><td align="center" valign="middle" >3.2</td></tr><tr><td align="center" valign="middle" >27</td><td align="center" valign="middle" >0.795</td><td align="center" valign="middle" >3.2</td><td align="center" valign="middle" >0.4</td><td align="center" valign="middle" >65</td><td align="center" valign="middle" >1.229</td><td align="center" valign="middle" >3.2</td><td align="center" valign="middle" >1.6</td><td align="center" valign="middle" >103</td><td align="center" valign="middle" >1.246</td><td align="center" valign="middle" >1.2</td><td align="center" valign="middle" >3.2</td></tr><tr><td align="center" valign="middle" >28</td><td align="center" valign="middle" >0.397</td><td align="center" valign="middle" >3.2</td><td align="center" valign="middle" >0.4</td><td align="center" 
valign="middle" >66</td><td align="center" valign="middle" >1.227</td><td align="center" valign="middle" >3.2</td><td align="center" valign="middle" >1.6</td><td align="center" valign="middle" >104</td><td align="center" valign="middle" >0.830</td><td align="center" valign="middle" >1.2</td><td align="center" valign="middle" >3.2</td></tr><tr><td align="center" valign="middle" >29</td><td align="center" valign="middle" >0.287</td><td align="center" valign="middle" >0.0</td><td align="center" valign="middle" >0.8</td><td align="center" valign="middle" >67</td><td align="center" valign="middle" >0.073</td><td align="center" valign="middle" >0.0</td><td align="center" valign="middle" >2.0</td><td align="center" valign="middle" >105</td><td align="center" valign="middle" >1.356</td><td align="center" valign="middle" >1.6</td><td align="center" valign="middle" >3.2</td></tr><tr><td align="center" valign="middle" >30</td><td align="center" valign="middle" >0.064</td><td align="center" valign="middle" >0.0</td><td align="center" valign="middle" >0.8</td><td align="center" valign="middle" >68</td><td align="center" valign="middle" >0.100</td><td align="center" valign="middle" >0.0</td><td align="center" valign="middle" >2.0</td><td align="center" valign="middle" >106</td><td align="center" valign="middle" >1.227</td><td align="center" valign="middle" >1.6</td><td align="center" valign="middle" >3.2</td></tr><tr><td align="center" valign="middle" >31</td><td align="center" valign="middle" >0.995</td><td align="center" valign="middle" >0.8</td><td align="center" valign="middle" >0.8</td><td align="center" valign="middle" >69</td><td align="center" valign="middle" >0.954</td><td align="center" valign="middle" >0.4</td><td align="center" valign="middle" >2.0</td><td align="center" valign="middle" >107</td><td align="center" valign="middle" >1.360</td><td align="center" valign="middle" >2.0</td><td align="center" valign="middle" >3.2</td></tr><tr><td align="center" 
valign="middle" >32</td><td align="center" valign="middle" >1.154</td><td align="center" valign="middle" >0.8</td><td align="center" valign="middle" >0.8</td><td align="center" valign="middle" >70</td><td align="center" valign="middle" >0.954</td><td align="center" valign="middle" >0.4</td><td align="center" valign="middle" >2.0</td><td align="center" valign="middle" >108</td><td align="center" valign="middle" >1.182</td><td align="center" valign="middle" >2.0</td><td align="center" valign="middle" >3.2</td></tr><tr><td align="center" valign="middle" >33</td><td align="center" valign="middle" >1.022</td><td align="center" valign="middle" >1.6</td><td align="center" valign="middle" >0.8</td><td align="center" valign="middle" >71</td><td align="center" valign="middle" >1.057</td><td align="center" valign="middle" >1.6</td><td align="center" valign="middle" >2.0</td><td align="center" valign="middle" >109</td><td align="center" valign="middle" >1.309</td><td align="center" valign="middle" >2.4</td><td align="center" valign="middle" >3.2</td></tr><tr><td align="center" valign="middle" >34</td><td align="center" valign="middle" >1.085</td><td align="center" valign="middle" >1.6</td><td align="center" valign="middle" >0.8</td><td align="center" valign="middle" >72</td><td align="center" valign="middle" >1.155</td><td align="center" valign="middle" >1.6</td><td align="center" valign="middle" >2.0</td><td align="center" valign="middle" >110</td><td align="center" valign="middle" >1.449</td><td align="center" valign="middle" >2.4</td><td align="center" valign="middle" >3.2</td></tr><tr><td align="center" valign="middle" >35</td><td align="center" valign="middle" >0.972</td><td align="center" valign="middle" >2.4</td><td align="center" valign="middle" >0.8</td><td align="center" valign="middle" >73</td><td align="center" valign="middle" >1.403</td><td align="center" valign="middle" >2.0</td><td align="center" valign="middle" >2.0</td><td align="center" valign="middle" 
>111</td><td align="center" valign="middle" >1.248</td><td align="center" valign="middle" >2.8</td><td align="center" valign="middle" >3.2</td></tr><tr><td align="center" valign="middle" >36</td><td align="center" valign="middle" >1.078</td><td align="center" valign="middle" >2.4</td><td align="center" valign="middle" >0.8</td><td align="center" valign="middle" >74</td><td align="center" valign="middle" >1.422</td><td align="center" valign="middle" >2.0</td><td align="center" valign="middle" >2.0</td><td align="center" valign="middle" >112</td><td align="center" valign="middle" >1.141</td><td align="center" valign="middle" >2.8</td><td align="center" valign="middle" >3.2</td></tr><tr><td align="center" valign="middle" >37</td><td align="center" valign="middle" >1.169</td><td align="center" valign="middle" >3.2</td><td align="center" valign="middle" >0.8</td><td align="center" valign="middle" >75</td><td align="center" valign="middle" >1.387</td><td align="center" valign="middle" >3.2</td><td align="center" valign="middle" >2.0</td><td align="center" valign="middle" >113</td><td align="center" valign="middle" >1.279</td><td align="center" valign="middle" >3.2</td><td align="center" valign="middle" >3.2</td></tr><tr><td align="center" valign="middle" >38</td><td align="center" valign="middle" >0.836</td><td align="center" valign="middle" >3.2</td><td align="center" valign="middle" >0.8</td><td align="center" valign="middle" >76</td><td align="center" valign="middle" >1.261</td><td align="center" valign="middle" >3.2</td><td align="center" valign="middle" >2.0</td><td align="center" valign="middle" >114</td><td align="center" valign="middle" >1.188</td><td align="center" valign="middle" >3.2</td><td align="center" valign="middle" >3.2</td></tr></tbody></table></table-wrap></sec></body><back><ref-list><title>References</title><ref id="scirp.63884-ref1"><label>1</label><mixed-citation publication-type="other" xlink:type="simple">Kmenta, J. 
(2011) Elements of Econometrics. 2nd Edition, The University of Michigan Press, Ann Arbor.</mixed-citation></ref><ref id="scirp.63884-ref2"><label>2</label><mixed-citation publication-type="other" xlink:type="simple">Stock, J.H. and Watson, M.W. (2011) Introduction to Econometrics. 3rd Edition, Addison-Wesley, Boston.</mixed-citation></ref><ref id="scirp.63884-ref3"><label>3</label><mixed-citation publication-type="other" xlink:type="simple">Brooke, A., Kendrick, D. and Meeraus, A. (1988) GAMS—A User’s Guide. The Scientific Press, Redwood City.</mixed-citation></ref><ref id="scirp.63884-ref4"><label>4</label><mixed-citation publication-type="other" xlink:type="simple">Paris, Q. (2015) The Dual of the Least-Squares Method. Open Journal of Statistics, 5, 658-664. http://dx.doi.org/10.4236/ojs.2015.57067</mixed-citation></ref><ref id="scirp.63884-ref5"><label>5</label><mixed-citation publication-type="other" xlink:type="simple">Heady, E.O., Pesek, J.T. and Brown, W.G. (1955) Corn Response Surfaces and Economic Optima in Fertilizer Use. Iowa State Experimental Station, Bulletin 424.</mixed-citation></ref></ref-list></back></article>