<!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Transitional//EN" "http://www.w3.org/TR/xhtml1/DTD/xhtml1-transitional.dtd">
<html xmlns="http://www.w3.org/1999/xhtml">
<head>
<meta http-equiv="Content-Type" content="text/xhtml;charset=UTF-8"/>
<meta http-equiv="X-UA-Compatible" content="IE=9"/>
<meta name="generator" content="Doxygen 1.8.4"/>
<title>MADlib: PCA Training</title>
<link href="tabs.css" rel="stylesheet" type="text/css"/>
<script type="text/javascript" src="jquery.js"></script>
<script type="text/javascript" src="dynsections.js"></script>
<link href="navtree.css" rel="stylesheet" type="text/css"/>
<script type="text/javascript" src="resize.js"></script>
<script type="text/javascript" src="navtree.js"></script>
<script type="text/javascript">
$(document).ready(initResizable);
$(window).load(resizeHeight);
</script>
<link href="search/search.css" rel="stylesheet" type="text/css"/>
<script type="text/javascript" src="search/search.js"></script>
<script type="text/javascript">
$(document).ready(function() { searchBox.OnSelectItem(0); });
</script>
<script type="text/x-mathjax-config">
MathJax.Hub.Config({
extensions: ["tex2jax.js", "TeX/AMSmath.js", "TeX/AMSsymbols.js"],
jax: ["input/TeX","output/HTML-CSS"]
});
</script><script src="../mathjax/MathJax.js"></script>
<link href="doxygen.css" rel="stylesheet" type="text/css" />
</head>
<body>
<div id="top"><!-- do not remove this div, it is closed by doxygen! -->
<div id="titlearea">
<table cellspacing="0" cellpadding="0">
<tbody>
<tr style="height: 56px;">
<td style="padding-left: 0.5em;">
<div id="projectname">MADlib
&#160;<span id="projectnumber">1.1</span> <span style="font-size:10pt; font-style:italic"><a href="../latest/./group__grp__pca__train.html"> A newer version is available</a></span>
</div>
<div id="projectbrief">User Documentation</div>
</td>
</tr>
</tbody>
</table>
</div>
<!-- end header part -->
<!-- Generated by Doxygen 1.8.4 -->
<script type="text/javascript">
var searchBox = new SearchBox("searchBox", "search",false,'Search');
</script>
<div id="navrow1" class="tabs">
<ul class="tablist">
<li><a href="index.html"><span>Main&#160;Page</span></a></li>
<li><a href="modules.html"><span>Modules</span></a></li>
<li><a href="files.html"><span>Files</span></a></li>
<li>
<div id="MSearchBox" class="MSearchBoxInactive">
<span class="left">
<img id="MSearchSelect" src="search/mag_sel.png"
onmouseover="return searchBox.OnSearchSelectShow()"
onmouseout="return searchBox.OnSearchSelectHide()"
alt=""/>
<input type="text" id="MSearchField" value="Search" accesskey="S"
onfocus="searchBox.OnSearchFieldFocus(true)"
onblur="searchBox.OnSearchFieldFocus(false)"
onkeyup="searchBox.OnSearchFieldChange(event)"/>
</span><span class="right">
<a id="MSearchClose" href="javascript:searchBox.CloseResultsWindow()"><img id="MSearchCloseImg" border="0" src="search/close.png" alt=""/></a>
</span>
</div>
</li>
</ul>
</div>
</div><!-- top -->
<div id="side-nav" class="ui-resizable side-nav-resizable">
<div id="nav-tree">
<div id="nav-tree-contents">
<div id="nav-sync" class="sync"></div>
</div>
</div>
<div id="splitbar" style="-moz-user-select:none;"
class="ui-resizable-handle">
</div>
</div>
<script type="text/javascript">
$(document).ready(function(){initNavTree('group__grp__pca__train.html','');});
</script>
<div id="doc-content">
<!-- window showing the filter options -->
<div id="MSearchSelectWindow"
onmouseover="return searchBox.OnSearchSelectShow()"
onmouseout="return searchBox.OnSearchSelectHide()"
onkeydown="return searchBox.OnSearchSelectKey(event)">
<a class="SelectItem" href="javascript:void(0)" onclick="searchBox.OnSelectItem(0)"><span class="SelectionMark">&#160;</span>All</a><a class="SelectItem" href="javascript:void(0)" onclick="searchBox.OnSelectItem(1)"><span class="SelectionMark">&#160;</span>Files</a><a class="SelectItem" href="javascript:void(0)" onclick="searchBox.OnSelectItem(2)"><span class="SelectionMark">&#160;</span>Functions</a><a class="SelectItem" href="javascript:void(0)" onclick="searchBox.OnSelectItem(3)"><span class="SelectionMark">&#160;</span>Groups</a></div>
<!-- iframe showing the search results (closed by default) -->
<div id="MSearchResultsWindow">
<iframe src="javascript:void(0)" frameborder="0"
name="MSearchResults" id="MSearchResults">
</iframe>
</div>
<div class="header">
<div class="headertitle">
<div class="title">PCA Training<div class="ingroups"><a class="el" href="group__grp__pca.html">Principal Component Analysis</a></div></div> </div>
</div><!--header-->
<div class="contents">
<div id="dynsection-0" onclick="return toggleVisibility(this)" class="dynheader closed" style="cursor:pointer;">
<img id="dynsection-0-trigger" src="closed.png" alt="+"/> Collaboration diagram for PCA Training:</div>
<div id="dynsection-0-summary" class="dynsummary" style="display:block;">
</div>
<div id="dynsection-0-content" class="dyncontent" style="display:none;">
<center><table><tr><td><div class="center"><iframe scrolling="no" frameborder="0" src="group__grp__pca__train.svg" width="318" height="56"><p><b>This browser is not able to show SVG: try Firefox, Chrome, Safari, or Opera instead.</b></p></iframe>
</div>
</td></tr></table></center>
</div>
<div class="toc"><b>Contents</b> </p>
<ul>
<li class="level1">
<a href="#pca_train">About</a> </li>
<li class="level1">
<a href="#help">Online Help</a> </li>
<li class="level1">
<a href="#train">Training Function</a> </li>
<li class="level1">
<a href="#output">Output Tables</a> </li>
<li class="level1">
<a href="#examples">Examples</a> </li>
<li class="level1">
<a href="#seealso">See Also</a> </li>
<li class="level1">
<a href="#background_pca">Technical Background</a> </li>
<li class="level1">
<a href="#literature">Literature</a> </li>
</ul>
</div><p><a class="anchor" id="pca_train"></a></p>
<dl class="section user"><dt>About:</dt><dd>Principal component analysis (PCA) is a mathematical procedure that uses an orthogonal transformation to convert a set of observations of possibly correlated variables into a set of values of linearly uncorrelated variables called principal components. This transformation is defined in such a way that the first principal component has the largest possible variance (i.e., accounts for as much of the variability in the data as possible), and each succeeding component in turn has the highest variance possible under the constraint that it be orthogonal to (i.e., uncorrelated with) the preceding components.</dd></dl>
<p>See the <a class="el" href="group__grp__pca__train.html#background_pca">Technical Background</a> for an introduction to principal component analysis and the implementation notes.</p>
<p><a class="anchor" id="help"></a></p>
<dl class="section user"><dt>Online Help</dt><dd>View short help messages using the following statements: <pre class="fragment">-- Summary of PCA projection
madlib.pca_train()
madlib.pca_train('?')
madlib.pca_train('help')
-- Training function syntax and output table format
madlib.pca_train('usage')
-- Summary of PCA training with sparse matrices
madlib.pca_sparse_train()
madlib.pca_sparse_train('?')
madlib.pca_sparse_train('help')
-- Training function syntax and output table format
madlib.pca_sparse_train('usage')
</pre></dd></dl>
<p><a class="anchor" id="train"></a></p>
<dl class="section user"><dt>Training Function</dt><dd>The training functions have the following formats: <pre class="fragment">pca_project( source_table, out_table, row_id,
k, grouping_cols:= NULL,
lanczos_iter := min(k+40, &lt;smallest_matrix_dimension&gt;),
use_correlation := False, result_summary_table := NULL)
</pre> and <pre class="fragment">pca_sparse_project(source_table, out_table,
row_id, col_id, val_id, row_dim, col_dim, k,
grouping_cols := NULL,
lanczos_iter := min(k+40, &lt;smallest_matrix_dimension&gt;),
use_correlation := False, result_summary_table := NULL)
</pre></dd></dl>
<dl class="section note"><dt>Note</dt><dd>Because of the centering step in PCA (see <a class="el" href="group__grp__pca__train.html#background_pca">Technical Background</a>), sparse matrices almost always become dense during the training process. Thus, this implementation automatically densifies sparse matrix input, and there should be no expected performance improvement in using sparse matrix input over dense matrix input.</dd></dl>
<dl class="section user"><dt>Arguments</dt><dd></dd></dl>
<dl class="section user"><dt></dt><dd><dl class="arglist">
<dt>source_table </dt>
<dd><p class="startdd">Text value. Name of the input table containing the data for PCA training. The input data matrix should have \( N \) rows and \( M \) columns, where \( N \) is the number of data points, and \( M \) is the number of features for each data point.</p>
<p>A dense input table is expected to be in one of the two standard MADlib dense matrix formats, and a sparse input table should be in the standard MADlib sparse matrix format.</p>
<p>The two standard MADlib dense matrix formats are </p>
<pre>{TABLE|VIEW} <em>source_table</em> (
<em>row_id</em> INTEGER,
row_vec FLOAT8[]
)</pre><p> and </p>
<pre>{TABLE|VIEW} <em>source_table</em> (
<em>row_id</em> INTEGER,
col1 FLOAT8,
col2 FLOAT8,
...
)</pre><p>Note that the name of the <em>row_id</em> column is passed as an input parameter; the column itself should contain the row indices (starting at 0) of the input matrix.</p>
<p>The input table for sparse PCA is expected to be in the form:</p>
<pre>{TABLE|VIEW} <em>source_table</em> (
...
<em>row_id</em> INTEGER,
<em>col_id</em> INTEGER,
<em>val_id</em> FLOAT8,
...
)</pre><p>The <em>row_id</em> and <em>col_id</em> columns specify which entries in the matrix are nonzero, and the <em>val_id</em> column defines the values of the nonzero entries. </p>
<p class="enddd"></p>
</dd>
<dt>out_table </dt>
<dd><p class="startdd">Text value. Name of the table that will contain the principal components of the input data.</p>
<p class="enddd"></p>
</dd>
<dt>row_id </dt>
<dd><p class="startdd">Text value. Column name containing the row IDs in the input source table.</p>
<p class="enddd"></p>
</dd>
<dt>col_id </dt>
<dd><p class="startdd">Text value. Name of 'col_id' column in sparse matrix representation (sparse matrices only). </p>
<p class="enddd"></p>
</dd>
<dt>val_id </dt>
<dd><p class="startdd">Text value. Name of 'val_id' column in sparse matrix representation (sparse matrices only). </p>
<p class="enddd"></p>
</dd>
<dt>row_dim </dt>
<dd><p class="startdd">Integer value. The number of rows in the sparse matrix (sparse matrices only). </p>
<p class="enddd"></p>
</dd>
<dt>col_dim </dt>
<dd><p class="startdd">Integer value. The number of columns in the sparse matrix (sparse matrices only). </p>
<p class="enddd"></p>
</dd>
<dt>k </dt>
<dd><p class="startdd">Integer value. The number of principal components to calculate from the input data. </p>
<p class="enddd"></p>
</dd>
<dt>grouping_cols </dt>
<dd><p class="startdd">Text value. Currently <em>grouping_cols</em> is present as a placeholder for forward compatibility. The parameter is planned to be implemented as a comma-separated list of column names, with the source data grouped using the combination of all the columns. An independent PCA model will be computed for each combination of the grouping columns. Default: NULL.</p>
<p class="enddd"></p>
</dd>
<dt>lanczos_iter </dt>
<dd><p class="startdd">Integer value. The number of Lanczos iterations for the SVD calculation. The Lanczos iteration number roughly corresponds to the accuracy of the SVD calculation, and a higher iteration number corresponds to greater accuracy but longer computation time. The number of iterations must be at least as large as the value of <em>k</em>, but no larger than the smallest dimension of the matrix. If the iteration number is given as zero, then the default number of iterations is used. Default: minimum of {k+40, smallest matrix dimension}.</p>
<p class="enddd"></p>
</dd>
<dt>use_correlation </dt>
<dd><p class="startdd">Boolean value. Whether to use the correlation matrix for calculating the principal components instead of the covariance matrix. Currently <em>use_correlation</em> is a placeholder for forward compatibility, and this value must be set to false. Default: False. </p>
<p class="enddd"></p>
</dd>
<dt>result_summary_table </dt>
<dd>Text value. Name of the optional summary table. Default: NULL. </dd>
</dl>
</dd></dl>
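<p>As an illustrative sketch (not part of the official examples), the optional arguments can be supplied positionally in the order shown above; the table and output names below are hypothetical: </p>
<pre class="fragment">-- Hypothetical call setting the optional arguments explicitly
SELECT madlib.pca_train( 'my_data',         -- source_table (dense format)
                         'my_pca_result',   -- out_table
                         'row_id',          -- row_id column
                         5,                 -- k: number of components
                         NULL,              -- grouping_cols (placeholder, not yet implemented)
                         45,                -- lanczos_iter
                         False,             -- use_correlation (must be false)
                         'my_pca_summary'   -- result_summary_table
);
</pre>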
<p><a class="anchor" id="output"></a></p>
<dl class="section user"><dt>Output Tables</dt><dd></dd></dl>
<p>The output is divided into three tables (one of which is optional). The output table (<em>out_table</em> above) encodes the principal components with the <em>k</em> highest eigenvalues. The table has the following columns: </p>
<dl class="section user"><dt></dt><dd><dl class="arglist">
<dt>row_id </dt>
<dd><p class="startdd">Eigenvalue rank in descending order of the eigenvalue size.</p>
<p class="enddd"></p>
</dd>
<dt>principal_components </dt>
<dd><p class="startdd">Vectors containing elements of the principal components.</p>
<p class="enddd"></p>
</dd>
<dt>eigen_values </dt>
<dd>The eigenvalues associated with each principal component. </dd>
</dl>
</dd></dl>
<p>In addition to the output table, a table containing the column means is also generated. This table has the same name as the output table, with the string "_mean" appended to the end. This table has only one column: </p>
<dl class="section user"><dt></dt><dd><dl class="arglist">
<dt>column_mean </dt>
<dd>A vector containing the column means for the input matrix. </dd>
</dl>
</dd></dl>
<p>The optional summary table contains information about the performance of the PCA. This table has the following columns: </p>
<dl class="section user"><dt></dt><dd><dl class="arglist">
<dt>rows_used </dt>
<dd>Number of data points in the input. </dd>
<dt>exec_time (ms) </dt>
<dd>Number of milliseconds for the PCA calculation to run. </dd>
<dt>iter </dt>
<dd>Number of iterations used in the SVD calculation. </dd>
<dt>recon_error </dt>
<dd>The absolute error in the SVD approximation. </dd>
<dt>relative_recon_error </dt>
<dd>The relative error in the SVD approximation. </dd>
<dt>use_correlation </dt>
<dd>Indicates if the correlation matrix was used. </dd>
</dl>
</dd></dl>
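<p>As a hedged illustration (assuming the output table is named <em>result_table</em>, as in the examples below, and that a summary table named <em>result_summary</em> was requested), the auxiliary tables can be inspected directly: </p>
<pre class="fragment">-- Column means used for centering; the table name is out_table with "_mean" appended
SELECT column_mean FROM result_table_mean;
-- Optional performance summary; only created when result_summary_table is given
SELECT * FROM result_summary;
</pre>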
<p><a class="anchor" id="examples"></a></p>
<dl class="section user"><dt>Examples:</dt><dd><ol type="1">
<li>Create the sample data. <pre class="fragment">sql&gt; DROP TABLE IF EXISTS mat;
CREATE TABLE mat (
row_id integer,
row_vec double precision[]
);
sql&gt; COPY mat (row_id, row_vec) FROM stdin;
0 {1,2,3}
1 {2,1,2}
2 {3,2,1}
\.</pre></li>
<li>Run the PCA function: <pre class="fragment">sql&gt; DROP TABLE IF EXISTS result_table;
sql&gt; SELECT pca_train(
    'mat',           -- name of the input table
    'result_table',  -- name of the output table
    'row_id',        -- column containing the matrix indices
    3                -- number of PCA components to compute
);
</pre></li>
<li>View the PCA results: <pre class="fragment">sql&gt; SELECT * from result_table;
row_id | principal_components | eigen_values
--------+--------------------------------------------------------------+----------------------
0 | {0.707106781186547,0.408248290459781,-0.577350269192513} | 2
2 | {-0.707106781186547,0.408248290459781,-0.577350269192512} | 1.26294130828989e-08
1 | {2.08166817117217e-17,-0.816496580931809,-0.577350269183852} | 0.816496580927726</pre></li>
</ol>
</dd></dl>
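<p>For sparse input there is no official example on this page; the following is a sketch only, assuming a hypothetical table <em>mat_sparse</em> in the MADlib sparse matrix format and the same zero-based index convention described for dense input: </p>
<pre class="fragment">sql&gt; CREATE TABLE mat_sparse (
    row_id integer,
    col_id integer,
    val_id double precision
);
sql&gt; INSERT INTO mat_sparse VALUES (0,0,1.0), (1,1,2.0), (2,0,4.0), (2,2,3.0);
sql&gt; SELECT pca_sparse_train(
    'mat_sparse',     -- name of the input table
    'sparse_result',  -- name of the output table
    'row_id',         -- column containing row indices
    'col_id',         -- column containing column indices
    'val_id',         -- column containing nonzero values
    3,                -- row dimension of the matrix
    3,                -- column dimension of the matrix
    3                 -- number of PCA components to compute
);
</pre>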
<p><a class="anchor" id="seealso"></a></p>
<dl class="section see"><dt>See Also</dt><dd>File <a class="el" href="pca_8sql__in.html" title="Principal Component Analysis. ">pca.sql_in</a> documenting the SQL functions </dd>
<dd>
<a class="el" href="group__grp__pca__project.html">PCA Projection</a></dd></dl>
<p><a class="anchor" id="background_pca"></a></p>
<dl class="section user"><dt>Technical Background</dt><dd></dd></dl>
<p>The PCA implemented here uses a singular value decomposition (SVD) to recover the principal components (as opposed to directly computing the eigenvectors of the covariance matrix). Let \( \boldsymbol X \) be the data matrix, and let \( \hat{x} \) be a vector of the column averages of \( \boldsymbol{X}\). PCA computes the matrix \( \hat{\boldsymbol X} \) as </p>
<p class="formulaDsp">
\[ \hat{\boldsymbol X} = {\boldsymbol X} - \vec{e} \hat{x}^T \]
</p>
<p> where \( \vec{e} \) is the vector of all ones.</p>
<p>PCA then computes the SVD matrix factorization </p>
<p class="formulaDsp">
\[ \hat{\boldsymbol X} = {\boldsymbol U}{\boldsymbol \Sigma}{\boldsymbol V}^T \]
</p>
<p> where \( {\boldsymbol \Sigma} \) is a diagonal matrix. The eigenvalues are recovered as the entries of \( {\boldsymbol \Sigma}/(\sqrt{N-1}) \), and the principal components are the rows of \( {\boldsymbol V} \).</p>
<p>It is important to note that the PCA implementation assumes that the user will use only the principal components that have non-zero eigenvalues. The SVD calculation is done with the Lanczos method, which does not guarantee correctness for singular vectors with zero-valued eigenvalues. Consequently, principal components with zero-valued eigenvalues are not guaranteed to be correct. Generally, this will not be a problem unless the user wants to use the principal components for the entire eigenspectrum.</p>
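<p>As an illustrative aside (the exact metric used by the implementation is not specified here), reconstruction errors of the kind reported in the optional summary table are commonly defined through the rank-\( k \) truncated SVD, for example using the Frobenius norm: </p>
<p class="formulaDsp">
\[ \hat{\boldsymbol X}_k = {\boldsymbol U}_k {\boldsymbol \Sigma}_k {\boldsymbol V}_k^T, \qquad \epsilon_{\mathrm{abs}} = \| \hat{\boldsymbol X} - \hat{\boldsymbol X}_k \|_F, \qquad \epsilon_{\mathrm{rel}} = \frac{\| \hat{\boldsymbol X} - \hat{\boldsymbol X}_k \|_F}{\| \hat{\boldsymbol X} \|_F} \]
</p>
<p> where \( {\boldsymbol U}_k \), \( {\boldsymbol \Sigma}_k \), and \( {\boldsymbol V}_k \) retain only the leading \( k \) singular triplets.</p>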
<p><a class="anchor" id="literature"></a></p>
<dl class="section user"><dt>Literature:</dt><dd></dd></dl>
<p>[1] Principal Component Analysis. <a href="http://en.wikipedia.org/wiki/Principal_component_analysis">http://en.wikipedia.org/wiki/Principal_component_analysis</a></p>
<p>[2] Shlens, Jonathon (2009), A Tutorial on Principal Component Analysis </p>
</div><!-- contents -->
</div><!-- doc-content -->
<!-- start footer part -->
<div id="nav-path" class="navpath"><!-- id is needed for treeview function! -->
<ul>
<li class="footer">Generated on Wed Aug 21 2013 16:09:52 for MADlib by
<a href="http://www.doxygen.org/index.html">
<img class="footer" src="doxygen.png" alt="doxygen"/></a> 1.8.4 </li>
</ul>
</div>
</body>
</html>