<!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Transitional//EN" "http://www.w3.org/TR/xhtml1/DTD/xhtml1-transitional.dtd">
<html xmlns="http://www.w3.org/1999/xhtml">
<head>
<meta http-equiv="Content-Type" content="text/xhtml;charset=UTF-8"/>
<title>MADlib: Logistic Regression</title>
<link href="tabs.css" rel="stylesheet" type="text/css"/>
<link href="doxygen.css" rel="stylesheet" type="text/css" />
<link href="navtree.css" rel="stylesheet" type="text/css"/>
<script type="text/javascript" src="jquery.js"></script>
<script type="text/javascript" src="resize.js"></script>
<script type="text/javascript" src="navtree.js"></script>
<script type="text/javascript">
$(document).ready(initResizable);
</script>
<link href="search/search.css" rel="stylesheet" type="text/css"/>
<script type="text/javascript" src="search/search.js"></script>
<script type="text/javascript">
$(document).ready(function() { searchBox.OnSelectItem(0); });
</script>
<script src="../mathjax/MathJax.js">
MathJax.Hub.Config({
extensions: ["tex2jax.js", "TeX/AMSmath.js", "TeX/AMSsymbols.js"],
jax: ["input/TeX","output/HTML-CSS"],
});
</script>
</head>
<body>
<div id="top"><!-- do not remove this div! -->
<div id="titlearea">
<table cellspacing="0" cellpadding="0">
<tbody>
<tr style="height: 56px;">
<td style="padding-left: 0.5em;">
<div id="projectname">MADlib
&#160;<span id="projectnumber">0.6</span> <span style="font-size:10pt; font-style:italic"><a href="../latest/./group__grp__logreg.html"> A newer version is available</a></span>
</div>
<div id="projectbrief">User Documentation</div>
</td>
</tr>
</tbody>
</table>
</div>
<!-- Generated by Doxygen 1.7.5.1 -->
<script type="text/javascript">
var searchBox = new SearchBox("searchBox", "search",false,'Search');
</script>
<script type="text/javascript" src="dynsections.js"></script>
<div id="navrow1" class="tabs">
<ul class="tablist">
<li><a href="index.html"><span>Main&#160;Page</span></a></li>
<li><a href="modules.html"><span>Modules</span></a></li>
<li><a href="files.html"><span>Files</span></a></li>
<li>
<div id="MSearchBox" class="MSearchBoxInactive">
<span class="left">
<img id="MSearchSelect" src="search/mag_sel.png"
onmouseover="return searchBox.OnSearchSelectShow()"
onmouseout="return searchBox.OnSearchSelectHide()"
alt=""/>
<input type="text" id="MSearchField" value="Search" accesskey="S"
onfocus="searchBox.OnSearchFieldFocus(true)"
onblur="searchBox.OnSearchFieldFocus(false)"
onkeyup="searchBox.OnSearchFieldChange(event)"/>
</span><span class="right">
<a id="MSearchClose" href="javascript:searchBox.CloseResultsWindow()"><img id="MSearchCloseImg" border="0" src="search/close.png" alt=""/></a>
</span>
</div>
</li>
</ul>
</div>
</div>
<div id="side-nav" class="ui-resizable side-nav-resizable">
<div id="nav-tree">
<div id="nav-tree-contents">
</div>
</div>
<div id="splitbar" style="-moz-user-select:none;"
class="ui-resizable-handle">
</div>
</div>
<script type="text/javascript">
initNavTree('group__grp__logreg.html','');
</script>
<div id="doc-content">
<div class="header">
<div class="headertitle">
<div class="title">Logistic Regression</div> </div>
<div class="ingroups"><a class="el" href="group__grp__suplearn.html">Supervised Learning</a></div></div>
<div class="contents">
<div id="dynsection-0" onclick="return toggleVisibility(this)" class="dynheader closed" style="cursor:pointer;">
<img id="dynsection-0-trigger" src="closed.png" alt="+"/> Collaboration diagram for Logistic Regression:</div>
<div id="dynsection-0-summary" class="dynsummary" style="display:block;">
</div>
<div id="dynsection-0-content" class="dyncontent" style="display:none;">
<center><table><tr><td><div class="center"><iframe scrolling="no" frameborder="0" src="group__grp__logreg.svg" width="358" height="40"><p><b>This browser is not able to show SVG: try Firefox, Chrome, Safari, or Opera instead.</b></p></iframe>
</div>
</td></tr></table></center>
</div>
<dl class="user"><dt><b>About:</b></dt><dd></dd></dl>
<p>(Binomial) Logistic regression refers to a stochastic model in which the conditional mean of the dependent dichotomous variable (usually denoted \( Y \in \{ 0,1 \} \)) is the logistic function of an affine function of the vector of independent variables (usually denoted \( \boldsymbol x \)). That is, </p>
<p class="formulaDsp">
\[ E[Y \mid \boldsymbol x] = \sigma(\boldsymbol c^T \boldsymbol x) \]
</p>
<p> for some unknown vector of coefficients \( \boldsymbol c \) and where \( \sigma(x) = \frac{1}{1 + \exp(-x)} \) is the logistic function. Logistic regression finds the vector of coefficients \( \boldsymbol c \) that maximizes the likelihood of the observations.</p>
<p>Let</p>
<ul>
<li>\( \boldsymbol y \in \{ 0,1 \}^n \) denote the vector of observed dependent variables, with \( n \) rows, containing the observed values of the dependent variable,</li>
<li>\( X \in \mathbf R^{n \times k} \) denote the design matrix with \( k \) columns and \( n \) rows, containing all observed vectors of independent variables \( \boldsymbol x_i \) as rows.</li>
</ul>
<p>By definition, </p>
<p class="formulaDsp">
\[ \Pr(Y = y_i \mid \boldsymbol x_i) = \sigma((-1)^{y_i + 1} \cdot \boldsymbol c^T \boldsymbol x_i) \,. \]
</p>
<p> Maximizing the likelihood \( \prod_{i=1}^n \Pr(Y = y_i \mid \boldsymbol x_i) \) is equivalent to maximizing the log-likelihood \( \sum_{i=1}^n \log \Pr(Y = y_i \mid \boldsymbol x_i) \), which simplifies to </p>
<p class="formulaDsp">
\[ l(\boldsymbol c) = -\sum_{i=1}^n \log(1 + \exp((-1)^{y_i} \cdot \boldsymbol c^T \boldsymbol x_i)) \,. \]
</p>
<p> The Hessian of this objective is \( H = -X^T A X \) where \( A = \text{diag}(a_1, \dots, a_n) \) is the diagonal matrix with \( a_i = \sigma(\boldsymbol c^T \boldsymbol x) \cdot \sigma(-\boldsymbol c^T \boldsymbol x) \,. \) Since \( H \) is non-positive definite, \( l(\boldsymbol c) \) is convex. There are many techniques for solving convex optimization problems. Currently, logistic regression in MADlib can use one of three algorithms:</p>
<ul>
<li>Iteratively Reweighted Least Squares</li>
<li>A conjugate-gradient approach, also known as Fletcher-Reeves method in the literature, where we use the Hestenes-Stiefel rule for calculating the step size.</li>
<li>Incremental gradient descent, also known as incremental gradient methods or stochastic gradient descent in the literature.</li>
</ul>
<p>We estimate the standard error for coefficient \( i \) as </p>
<p class="formulaDsp">
\[ \mathit{se}(c_i) = \sqrt{\left( (X^T A X)^{-1} \right)_{ii}} \,. \]
</p>
<p> The Wald z-statistic is </p>
<p class="formulaDsp">
\[ z_i = \frac{c_i}{\mathit{se}(c_i)} \,. \]
</p>
<p>The Wald \( p \)-value for coefficient \( i \) gives the probability (under the assumptions inherent in the Wald test) of seeing a value at least as extreme as the one observed, provided that the null hypothesis ( \( c_i = 0 \)) is true. Letting \( F \) denote the cumulative density function of a standard normal distribution, the Wald \( p \)-value for coefficient \( i \) is therefore </p>
<p class="formulaDsp">
\[ p_i = \Pr(|Z| \geq |z_i|) = 2 \cdot (1 - F( |z_i| )) \]
</p>
<p> where \( Z \) is a standard normally distributed random variable.</p>
<p>The odds ratio for coefficient \( i \) is estimated as \( \exp(c_i) \).</p>
<p>The condition number is computed as \( \kappa(X^T A X) \) during the iteration immediately <em>preceding</em> convergence (i.e., \( A \) is computed using the coefficients of the previous iteration). A large condition number (say, more than 1000) indicates the presence of significant multicollinearity.</p>
<dl class="user"><dt><b>Input:</b></dt><dd></dd></dl>
<p>The training data is expected to be of the following form:<br/>
</p>
<pre>{TABLE|VIEW} <em>sourceName</em> (
...
<em>dependentVariable</em> BOOLEAN,
<em>independentVariables</em> FLOAT8[],
...
)</pre><dl class="user"><dt><b>Usage:</b></dt><dd><ul>
<li>Get vector of coefficients \( \boldsymbol c \) and all diagnostic statistics:<br/>
<pre>SELECT <a class="el" href="logistic_8sql__in.html#a32880a39de2e36b6c6be72691a6a4a40">logregr_train</a>(
'<em>sourceName</em>', '<em>outName</em>', '<em>dependentVariable</em>',
'<em>independentVariables</em>'[, '<em>grouping_columns</em>',
[, <em>numberOfIterations</em> [, '<em>optimizer</em>' [, <em>precision</em>
[, <em>verbose</em> ]] ] ] ]
);</pre> Output table: <pre>coef | log_likelihood | std_err | z_stats | p_values | odds_ratios | condition_no | num_iterations
-----+----------------+---------+---------+----------+-------------+--------------+---------------
...
</pre></li>
<li>Get vector of coefficients \( \boldsymbol c \):<br/>
<pre>SELECT coef from outName; </pre></li>
<li>Get a subset of the output columns, e.g., only the array of coefficients \( \boldsymbol c \), the log-likelihood of determination \( l(\boldsymbol c) \), and the array of p-values \( \boldsymbol p \): <pre>SELECT coef, log_likelihood, p_values FROM outName; </pre></li>
<li>By default, the option <em>verbose</em> is False. If it is set to be True, warning messages will be output to the SQL client for groups that failed.</li>
</ul>
</dd></dl>
<dl class="user"><dt><b>Examples:</b></dt><dd></dd></dl>
<ol type="1">
<li>Create the sample data set: <div class="fragment"><pre class="fragment">
sql&gt; SELECT * FROM data;
r1 | val
---------------------------------------------+-----
{1,3.01789340097457,0.454183579888195} | t
{1,-2.59380532894284,0.602678326424211} | f
{1,-1.30643094424158,0.151587064377964} | t
{1,3.60722299199551,0.963550757616758} | t
{1,-1.52197745628655,0.0782248834148049} | t
{1,-4.8746574902907,0.345104880165309} | f
...
</pre></div></li>
<li>Run the logistic regression function: <div class="fragment"><pre class="fragment">
sql&gt; \x on
Expanded display is on.
sql&gt; SELECT logregr_train('data', 'out_tbl', 'val', 'r1', Null, 100, 'irls', 0.001);
sql&gt; SELECT * from out_tbl;
coef | {5.59049410898112,2.11077546770772,-0.237276684606453}
log_likelihood | -467.214718489873
std_err | {0.318943457652178,0.101518723785383,0.294509929481773}
z_stats | {17.5281667482197,20.7919819024719,-0.805666162169712}
p_values | {8.73403463417837e-69,5.11539430631541e-96,0.420435365338518}
odds_ratios | {267.867942976278,8.2546400100702,0.788773016471171}
condition_no | 179.186118573205
num_iterations | 9
</pre></div></li>
</ol>
<dl class="user"><dt><b>Literature:</b></dt><dd></dd></dl>
<p>A somewhat random selection of nice write-ups, with valuable pointers into further literature:</p>
<p>[1] Cosma Shalizi: Statistics 36-350: Data Mining, Lecture Notes, 18 November 2009, <a href="http://www.stat.cmu.edu/~cshalizi/350/lectures/26/lecture-26.pdf">http://www.stat.cmu.edu/~cshalizi/350/lectures/26/lecture-26.pdf</a></p>
<p>[2] Thomas P. Minka: A comparison of numerical optimizers for logistic regression, 2003 (revised Mar 26, 2007), <a href="http://research.microsoft.com/en-us/um/people/minka/papers/logreg/minka-logreg.pdf">http://research.microsoft.com/en-us/um/people/minka/papers/logreg/minka-logreg.pdf</a></p>
<p>[3] Paul Komarek, Andrew W. Moore: Making Logistic Regression A Core Data Mining Tool With TR-IRLS, IEEE International Conference on Data Mining 2005, pp. 685-688, <a href="http://komarix.org/ac/papers/tr-irls.short.pdf">http://komarix.org/ac/papers/tr-irls.short.pdf</a></p>
<p>[4] D. P. Bertsekas: Incremental gradient, subgradient, and proximal methods for convex optimization: a survey, Technical report, Laboratory for Information and Decision Systems, 2010, <a href="http://web.mit.edu/dimitrib/www/Incremental_Survey_LIDS.pdf">http://web.mit.edu/dimitrib/www/Incremental_Survey_LIDS.pdf</a></p>
<p>[5] A. Nemirovski, A. Juditsky, G. Lan, and A. Shapiro: Robust stochastic approximation approach to stochastic programming, SIAM Journal on Optimization, 19(4), 2009, <a href="http://www2.isye.gatech.edu/~nemirovs/SIOPT_RSA_2009.pdf">http://www2.isye.gatech.edu/~nemirovs/SIOPT_RSA_2009.pdf</a></p>
<dl class="see"><dt><b>See also:</b></dt><dd>File <a class="el" href="logistic_8sql__in.html" title="SQL functions for logistic regression.">logistic.sql_in</a> (documenting the SQL functions) </dd></dl>
</div>
</div>
<div id="nav-path" class="navpath">
<ul>
<!-- window showing the filter options -->
<div id="MSearchSelectWindow"
onmouseover="return searchBox.OnSearchSelectShow()"
onmouseout="return searchBox.OnSearchSelectHide()"
onkeydown="return searchBox.OnSearchSelectKey(event)">
<a class="SelectItem" href="javascript:void(0)" onclick="searchBox.OnSelectItem(0)"><span class="SelectionMark">&#160;</span>All</a><a class="SelectItem" href="javascript:void(0)" onclick="searchBox.OnSelectItem(1)"><span class="SelectionMark">&#160;</span>Files</a><a class="SelectItem" href="javascript:void(0)" onclick="searchBox.OnSelectItem(2)"><span class="SelectionMark">&#160;</span>Functions</a></div>
<!-- iframe showing the search results (closed by default) -->
<div id="MSearchResultsWindow">
<iframe src="javascript:void(0)" frameborder="0"
name="MSearchResults" id="MSearchResults">
</iframe>
</div>
<li class="footer">Generated on Tue Apr 2 2013 14:57:03 for MADlib by
<a href="http://www.doxygen.org/index.html">
<img class="footer" src="doxygen.png" alt="doxygen"/></a> 1.7.5.1 </li>
</ul>
</div>
</body>
</html>