<!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Transitional//EN" "http://www.w3.org/TR/xhtml1/DTD/xhtml1-transitional.dtd">
<html xmlns="http://www.w3.org/1999/xhtml">
<head>
<meta http-equiv="Content-Type" content="text/xhtml;charset=UTF-8"/>
<title>MADlib: Support Vector Machines</title>
<link href="tabs.css" rel="stylesheet" type="text/css"/>
<link href="doxygen.css" rel="stylesheet" type="text/css" />
<link href="navtree.css" rel="stylesheet" type="text/css"/>
<script type="text/javascript" src="jquery.js"></script>
<script type="text/javascript" src="resize.js"></script>
<script type="text/javascript" src="navtree.js"></script>
<script type="text/javascript">
$(document).ready(initResizable);
</script>
<link href="search/search.css" rel="stylesheet" type="text/css"/>
<script type="text/javascript" src="search/search.js"></script>
<script type="text/javascript">
$(document).ready(function() { searchBox.OnSelectItem(0); });
</script>
<script src="../mathjax/MathJax.js">
MathJax.Hub.Config({
extensions: ["tex2jax.js", "TeX/AMSmath.js", "TeX/AMSsymbols.js"],
jax: ["input/TeX","output/HTML-CSS"],
});
</script>
</head>
<body>
<div id="top"><!-- do not remove this div! -->
<div id="titlearea">
<table cellspacing="0" cellpadding="0">
<tbody>
<tr style="height: 56px;">
<td style="padding-left: 0.5em;">
<div id="projectname">MADlib
&#160;<span id="projectnumber">0.6</span> <span style="font-size:10pt; font-style:italic"><a href="../latest/./group__grp__kernmach.html"> A newer version is available</a></span>
</div>
<div id="projectbrief">User Documentation</div>
</td>
</tr>
</tbody>
</table>
</div>
<!-- Generated by Doxygen 1.7.5.1 -->
<script type="text/javascript">
var searchBox = new SearchBox("searchBox", "search",false,'Search');
</script>
<script type="text/javascript" src="dynsections.js"></script>
<div id="navrow1" class="tabs">
<ul class="tablist">
<li><a href="index.html"><span>Main&#160;Page</span></a></li>
<li><a href="modules.html"><span>Modules</span></a></li>
<li><a href="files.html"><span>Files</span></a></li>
<li>
<div id="MSearchBox" class="MSearchBoxInactive">
<span class="left">
<img id="MSearchSelect" src="search/mag_sel.png"
onmouseover="return searchBox.OnSearchSelectShow()"
onmouseout="return searchBox.OnSearchSelectHide()"
alt=""/>
<input type="text" id="MSearchField" value="Search" accesskey="S"
onfocus="searchBox.OnSearchFieldFocus(true)"
onblur="searchBox.OnSearchFieldFocus(false)"
onkeyup="searchBox.OnSearchFieldChange(event)"/>
</span><span class="right">
<a id="MSearchClose" href="javascript:searchBox.CloseResultsWindow()"><img id="MSearchCloseImg" border="0" src="search/close.png" alt=""/></a>
</span>
</div>
</li>
</ul>
</div>
</div>
<div id="side-nav" class="ui-resizable side-nav-resizable">
<div id="nav-tree">
<div id="nav-tree-contents">
</div>
</div>
<div id="splitbar" style="-moz-user-select:none;"
class="ui-resizable-handle">
</div>
</div>
<script type="text/javascript">
initNavTree('group__grp__kernmach.html','');
</script>
<div id="doc-content">
<div class="header">
<div class="headertitle">
<div class="title">Support Vector Machines</div> </div>
<div class="ingroups"><a class="el" href="group__grp__suplearn.html">Supervised Learning</a></div></div>
<div class="contents">
<div id="dynsection-0" onclick="return toggleVisibility(this)" class="dynheader closed" style="cursor:pointer;">
<img id="dynsection-0-trigger" src="closed.png" alt="+"/> Collaboration diagram for Support Vector Machines:</div>
<div id="dynsection-0-summary" class="dynsummary" style="display:block;">
</div>
<div id="dynsection-0-content" class="dyncontent" style="display:none;">
<center><table><tr><td><div class="center"><iframe scrolling="no" frameborder="0" src="group__grp__kernmach.svg" width="392" height="40"><p><b>This browser is not able to show SVG: try Firefox, Chrome, Safari, or Opera instead.</b></p></iframe>
</div>
</td></tr></table></center>
</div>
<dl class="user"><dt><b>About:</b></dt><dd></dd></dl>
<p>Support vector machines (SVMs) and related kernel methods have been among the most popular and well-studied machine learning techniques of the past 15 years, with a wide range of innovations and applications.</p>
<p>In a nutshell, an SVM model \( f(\boldsymbol x) \) takes the form of </p>
<p class="formulaDsp">
\[ f(\boldsymbol x) = \sum_i \alpha_i k(\boldsymbol x_i, \boldsymbol x), \]
</p>
<p> where each \( \alpha_i \) is a real number, each \( \boldsymbol x_i \) is a data point from the training set (called a support vector), and \( k(\cdot, \cdot) \) is a kernel function that measures how "similar" two objects are. In regression, \( f(\boldsymbol x) \) is the regression function we seek. In classification, \( f(\boldsymbol x) \) serves as the decision function: in binary classification, for example, the predictor can output class 1 for an object \( \boldsymbol x \) if \( f(\boldsymbol x) \geq 0 \), and class 2 otherwise.</p>
<p>In the case when the kernel function \( k(\cdot, \cdot) \) is the standard inner product on vectors, \( f(\boldsymbol x) \) is just an alternative way of writing a linear function </p>
<p class="formulaDsp">
\[ f&#39;(\boldsymbol x) = \langle \boldsymbol w, \boldsymbol x \rangle, \]
</p>
<p> where \( \boldsymbol w \) is a weight vector of the same dimension as \( \boldsymbol x \). One of the key strengths of SVMs is that more sophisticated kernel functions let us efficiently learn linear models in high-dimensional feature spaces, since \( k(\boldsymbol x_i, \boldsymbol x_j) \) can be understood as an efficient way of computing an inner product in the feature space: </p>
<p class="formulaDsp">
\[ k(\boldsymbol x_i, \boldsymbol x_j) = \langle \phi(\boldsymbol x_i), \phi(\boldsymbol x_j) \rangle, \]
</p>
<p> where \( \phi(\boldsymbol x) \) projects \( \boldsymbol x \) into a (possibly infinite-dimensional) feature space.</p>
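<p>As a concrete, textbook illustration of this point (not tied to any particular kernel shipped with this module), take two-dimensional inputs and the homogeneous degree-2 polynomial kernel </p>
<p class="formulaDsp">
\[ k(\boldsymbol x, \boldsymbol y) = \langle \boldsymbol x, \boldsymbol y \rangle^2 = \langle \phi(\boldsymbol x), \phi(\boldsymbol y) \rangle, \qquad \phi(\boldsymbol x) = (x_1^2,\; \sqrt{2}\,x_1 x_2,\; x_2^2). \]
</p>
<p> Evaluating \( k \) costs only a two-dimensional inner product and a squaring, yet it equals the inner product of the two mapped points in the three-dimensional feature space defined by \( \phi \).</p>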
<p>There are many algorithms for learning kernel machines. This module implements the class of online learning with kernels algorithms described in Kivinen et al. [1]. It also includes the Stochastic Gradient Descent (SGD) method [3] for learning linear SVMs with the hinge loss \(l(z) = \max(0, 1-z)\). See also the book by Sch&ouml;lkopf and Smola [2] for a much more detailed treatment.</p>
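<p>For orientation, the standard SGD step for the \( \ell_2 \)-regularised hinge loss can be sketched as follows; this illustrates the general technique, and the exact update rule and learning-rate schedule used by the implementation may differ. Given a training pair \( (\boldsymbol x_t, y_t) \) with \( y_t \in \{-1,+1\} \), learning rate \( \eta \) and regularisation parameter \( \lambda \), if \( y_t \langle \boldsymbol w, \boldsymbol x_t \rangle &lt; 1 \) the weight vector is updated as </p>
<p class="formulaDsp">
\[ \boldsymbol w \leftarrow (1 - \eta\lambda)\,\boldsymbol w + \eta\, y_t\, \boldsymbol x_t, \]
</p>
<p> and otherwise only the shrinkage step \( \boldsymbol w \leftarrow (1 - \eta\lambda)\,\boldsymbol w \) is applied.</p>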
<p>The SGD implementation is based on L&eacute;on Bottou's SGD package (<a href="http://leon.bottou.org/projects/sgd">http://leon.bottou.org/projects/sgd</a>). The methods introduced in [1] are implemented according to their original descriptions, except that we only update the support vector model when a significant error is made. The original algorithms in [1] update the support vector model at every step, even when no error is made, in the name of regularisation. In practice, and as verified empirically to some degree, updating only when necessary is both faster and better from a learning-theoretic point of view, at least in the i.i.d. setting.</p>
<p>Methods for classification, regression and novelty detection are available. Multiple instances of the algorithms can be executed in parallel on different subsets of the training data. The resultant support vector models can then be combined using standard techniques like averaging or majority voting.</p>
<p>Training data points are accessed via a table or a view. The support vector models can also be stored in tables for fast execution.</p>
<dl class="user"><dt><b>Input:</b></dt><dd>For classification and regression, the training table/view is expected to be of the following form (the array size of <em>ind</em> must not be greater than 102,400.):<br/>
<pre>{TABLE|VIEW} <em>input_table</em> (
...
<em>id</em> INT,
<em>ind</em> FLOAT8[],
<em>label</em> FLOAT8,
...
)</pre> For novelty detection, the label field is not required.</dd></dl>
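<p>For example, a minimal input table matching this layout could be created and populated as follows (the schema, table name, and values here are purely illustrative): </p>
<pre>CREATE TABLE my_schema.my_train_data (
    id    INT,       -- point ID
    ind   FLOAT8[],  -- feature vector
    label FLOAT8     -- target value (regression) or class label (classification)
);
INSERT INTO my_schema.my_train_data VALUES
    (1, '{0.5, -1.2, 3.0, 0.7, 10.0}', 50),
    (2, '{0.1,  2.2, -0.3, 1.9, 0.0}', 0);</pre>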
<dl class="user"><dt><b>Usage:</b></dt><dd></dd></dl>
<ul>
<li>Regression learning is achieved through the following function: <pre>SELECT <a class="el" href="online__sv_8sql__in.html#ac5cb9c20d6620b155ac872576a056f2a">svm_regression</a>(
'<em>input_table</em>', '<em>model_table</em>', <em>parallel</em>, '<em>kernel_func</em>',
<em>verbose DEFAULT false</em>, <em>eta DEFAULT 0.1</em>, <em>nu DEFAULT 0.005</em>, <em>slambda DEFAULT 0.05</em>
);</pre></li>
</ul>
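<p>For instance, the optional arguments can be supplied explicitly. A call that trains a single (non-parallel) regression model with verbose output and non-default learning parameters might look like this (the parameter values are purely illustrative): </p>
<pre>SELECT MADlib.svm_regression('my_schema.my_train_data', 'myexp', false, 'MADlib.svm_dot',
                             true,    -- verbose
                             0.05,    -- eta
                             0.005,   -- nu
                             0.05     -- slambda
);</pre>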
<ul>
<li>Classification learning is achieved through the following two functions:<ol type="a">
<li>Learn linear SVM(s) using SGD [3]: <pre>SELECT <a class="el" href="online__sv_8sql__in.html#a50896def00d0e0950bec3d95b387e6b9">lsvm_classification</a>(
'<em>input_table</em>', '<em>model_table</em>', <em>parallel</em>,
<em>verbose DEFAULT false</em>, <em>eta DEFAULT 0.1</em>, <em>reg DEFAULT 0.001</em>
);</pre></li>
<li>Learn linear or non-linear SVM(s) using the method described in [1]: <pre>SELECT <a class="el" href="online__sv_8sql__in.html#ad90b6bf3b807f22d37b0e2b1893262f0">svm_classification</a>(
'<em>input_table</em>', '<em>model_table</em>', <em>parallel</em>, '<em>kernel_func</em>',
<em>verbose DEFAULT false</em>, <em>eta DEFAULT 0.1</em>, <em>nu DEFAULT 0.005</em>
);</pre></li>
</ol>
</li>
</ul>
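<p>The two classifiers take different optional arguments: the SGD-based linear learner in (a) accepts a regularisation constant <em>reg</em> in place of a kernel function. An illustrative call with all optional arguments spelled out (the values are arbitrary) is: </p>
<pre>SELECT MADlib.lsvm_classification('my_schema.my_train_data', 'myexpc', false,
                                  true,    -- verbose
                                  0.05,    -- eta
                                  0.001    -- reg
);</pre>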
<ul>
<li>Novelty detection is achieved through the following function: <pre>SELECT <a class="el" href="online__sv_8sql__in.html#a5bae5335b51e448cd7fb9cb7a54b0bfa">svm_novelty_detection</a>(
'<em>input_table</em>', '<em>model_table</em>', <em>parallel</em>, '<em>kernel_func</em>',
<em>verbose DEFAULT false</em>, <em>eta DEFAULT 0.1</em>, <em>nu DEFAULT 0.005</em>
);</pre> Assuming the model_table parameter is given the value 'model', each learning function will produce two tables as output: 'model' and 'model_param'. The first contains the support vectors of the model(s) learned. The second contains the parameters of the model(s) learned, which include information such as the kernel function used and the value of the intercept, if there is one.</li>
</ul>
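<p>For example, if <em>model_table</em> was set to 'model', the two output tables can be inspected directly after training (the exact column layout depends on the learning function used and the installed version): </p>
<pre>SELECT count(*) FROM model;   -- support vectors of the learned model(s)
SELECT * FROM model_param;    -- kernel function, intercept, and other model parameters</pre>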
<ul>
<li>To make predictions on a single data point x using a single model learned previously, we use the function <pre>SELECT <a class="el" href="online__sv_8sql__in.html#a9916305653d464b23ef0fbd78867a654">svm_predict</a>('<em>model_table</em>',<em>x</em>);</pre> If the model is produced by the <a class="el" href="online__sv_8sql__in.html#a75d126981ae4bf2e6641627501f0a2a5" title="This is the linear support vector classification function.">lsvm_classification()</a> function, use the following prediction function instead <pre>SELECT <a class="el" href="online__sv_8sql__in.html#a5fe084c8364c0657097410458f8ea1e9">lsvm_predict</a>('<em>model_table</em>',<em>x</em>);</pre></li>
</ul>
<ul>
<li>To make predictions on new data points using multiple models learned in parallel, we use the function <pre>SELECT <a class="el" href="online__sv_8sql__in.html#a883ff4ca340d19a11204b461dd388276">svm_predict_combo</a>('<em>model_table</em>',<em>x</em>);</pre> If the models are produced by the <a class="el" href="online__sv_8sql__in.html#a75d126981ae4bf2e6641627501f0a2a5" title="This is the linear support vector classification function.">lsvm_classification()</a> function, use the following prediction function instead <pre>SELECT <a class="el" href="online__sv_8sql__in.html#a0ae9c50ca072757ff6493a8bf26dbc9c">lsvm_predict_combo</a>('<em>model_table</em>',<em>x</em>);</pre></li>
</ul>
<ul>
<li>Note that, at the moment, we cannot use MADlib.svm_predict() and MADlib.svm_predict_combo() on multiple data points. For example, something like the following will fail: <pre>SELECT <a class="el" href="online__sv_8sql__in.html#a9916305653d464b23ef0fbd78867a654">svm_predict</a>('<em>model_table</em>',<em>x</em>) FROM data_table;</pre> Instead, to make predictions on new data points stored in a table using previously learned models, we use the function: <pre>SELECT <a class="el" href="online__sv_8sql__in.html#a91ac71354e9dec74e25339bf168c2e5b">svm_predict_batch</a>('<em>input_table</em>', '<em>data_col</em>', '<em>id_col</em>', '<em>model_table</em>', '<em>output_table</em>', <em>parallel</em>);</pre> The output_table is created during the function call; an existing table with the same name will be dropped. If the parallel parameter is true, then each data point in the input table will have multiple predicted values corresponding to the number of models learned in parallel.<br/>
<br/>
Similarly, use the following function for batch prediction if the model(s) is produced by the <a class="el" href="online__sv_8sql__in.html#a75d126981ae4bf2e6641627501f0a2a5" title="This is the linear support vector classification function.">lsvm_classification()</a> function: <pre>SELECT <a class="el" href="online__sv_8sql__in.html#a1c0a002f50250133c0ef1d3c43c6d338">lsvm_predict_batch</a>('<em>input_table</em>', '<em>data_col</em>', '<em>id_col</em>', '<em>model_table</em>','<em>output_table</em>', <em>parallel</em>);</pre></li>
</ul>
<dl class="user"><dt><b>Implementation Notes:</b></dt><dd></dd></dl>
<p>Currently, three kernel functions have been implemented: dot product (<a class="el" href="online__sv_8sql__in.html#acc2d778a8eb48ab775ff9c1dff4a3141">svm_dot</a>), polynomial (<a class="el" href="online__sv_8sql__in.html#a1ac76fdf9623e0a4db47665f2a80be90">svm_polynomial</a>) and Gaussian (<a class="el" href="online__sv_8sql__in.html#a9f2a96e1a241ecc66386a78b110777d3">svm_gaussian</a>) kernels. To use the dot product kernel, simply pass '<code><em>MADlib.svm_dot</em></code>' as the <code>kernel_func</code> argument; this argument accepts any function that takes two float[] arguments and returns a float. To use the polynomial or Gaussian kernels, a wrapper function is needed, since these kernels require additional input parameters (see <a class="el" href="online__sv_8sql__in.html" title="SQL functions for support vector machines.">online_sv.sql_in</a> for the input parameters).</p>
<p>For example, to use the polynomial kernel with degree 2, first create a wrapper function: </p>
<pre>CREATE OR REPLACE FUNCTION mykernel(FLOAT[],FLOAT[]) RETURNS FLOAT AS $$
SELECT <a class="el" href="online__sv_8sql__in.html#a1ac76fdf9623e0a4db47665f2a80be90">svm_polynomial</a>($1,$2,2)
$$ language sql;</pre><p> Then call the SVM learning functions with <code>mykernel</code> as the argument to <code>kernel_func</code>. </p>
<pre>SELECT <a class="el" href="online__sv_8sql__in.html#ac5cb9c20d6620b155ac872576a056f2a">svm_regression</a>('my_schema.my_train_data', 'mymodel', false, 'mykernel');</pre><p>To drop all tables pertaining to the model, we can use </p>
<pre>SELECT <a class="el" href="online__sv_8sql__in.html#ab54d33f13c0e00faa358e3e3f17c10fb">svm_drop_model</a>('model_table');</pre><dl class="user"><dt><b>Examples:</b></dt><dd></dd></dl>
<p>As a general first step, we need to prepare and populate an input table/view with the following structure: </p>
<div class="fragment"><pre class="fragment">TABLE/VIEW my_schema.my_input_table
(
<span class="keywordtype">id</span> INT, -- point ID
ind FLOAT8[], -- data point
label FLOAT8 -- label of data point
);
</pre></div><p> Note: The label field is not required for novelty detection.</p>
<p><b>Example usage for regression</b>:</p>
<ol type="1">
<li>We can randomly generate 1,000 5-dimensional data points labelled by the simple target function <div class="fragment"><pre class="fragment">t(x) = <span class="keywordflow">if</span> x[5] = 10 then 50 <span class="keywordflow">else</span> <span class="keywordflow">if</span> x[5] = -10 then 50 <span class="keywordflow">else</span> 0;
</pre></div> and store them in the my_schema.my_train_data table as follows: <div class="fragment"><pre class="fragment">sql&gt; select MADlib.svm_generate_reg_data(<span class="stringliteral">&#39;my_schema.my_train_data&#39;</span>, 1000, 5);
</pre></div></li>
<li>We can now learn a regression model and store the resultant model under the name 'myexp'. <div class="fragment"><pre class="fragment">sql&gt; select MADlib.svm_regression(<span class="stringliteral">&#39;my_schema.my_train_data&#39;</span>, <span class="stringliteral">&#39;myexp&#39;</span>, <span class="keyword">false</span>, <span class="stringliteral">&#39;MADlib.svm_dot&#39;</span>);
</pre></div></li>
<li>We can now start using it to predict the labels of new data points as follows: <div class="fragment"><pre class="fragment">sql&gt; select MADlib.svm_predict(<span class="stringliteral">&#39;myexp&#39;</span>, <span class="stringliteral">&#39;{1,2,4,20,10}&#39;</span>);
sql&gt; select MADlib.svm_predict(<span class="stringliteral">&#39;myexp&#39;</span>, <span class="stringliteral">&#39;{1,2,4,20,-10}&#39;</span>);
</pre></div></li>
<li>To learn multiple support vector models, we replace the learning step above by <div class="fragment"><pre class="fragment">sql&gt; select MADlib.svm_regression(<span class="stringliteral">&#39;my_schema.my_train_data&#39;</span>, <span class="stringliteral">&#39;myexp&#39;</span>, <span class="keyword">true</span>, <span class="stringliteral">&#39;MADlib.svm_dot&#39;</span>);
</pre></div> The resultant models can be used for prediction as follows: <div class="fragment"><pre class="fragment">sql&gt; select * from MADlib.svm_predict_combo(<span class="stringliteral">&#39;myexp&#39;</span>, <span class="stringliteral">&#39;{1,2,4,20,10}&#39;</span>);
</pre></div></li>
<li>We can also predict the labels of all the data points stored in a table. For example, we can execute the following: <div class="fragment"><pre class="fragment">sql&gt; create table MADlib.svm_reg_test ( <span class="keywordtype">id</span> <span class="keywordtype">int</span>, ind float8[] );
sql&gt; insert into MADlib.svm_reg_test (select <span class="keywordtype">id</span>, ind from my_schema.my_train_data limit 20);
sql&gt; select MADlib.svm_predict_batch(<span class="stringliteral">&#39;MADlib.svm_reg_test&#39;</span>, <span class="stringliteral">&#39;ind&#39;</span>, <span class="stringliteral">&#39;id&#39;</span>, <span class="stringliteral">&#39;myexp&#39;</span>, <span class="stringliteral">&#39;MADlib.svm_reg_output1&#39;</span>, <span class="keyword">false</span>);
sql&gt; select * from MADlib.svm_reg_output1;
sql&gt; select MADlib.svm_predict_batch(<span class="stringliteral">&#39;MADlib.svm_reg_test&#39;</span>, <span class="stringliteral">&#39;ind&#39;</span>, <span class="stringliteral">&#39;id&#39;</span>, <span class="stringliteral">&#39;myexp&#39;</span>, <span class="stringliteral">&#39;MADlib.svm_reg_output2&#39;</span>, <span class="keyword">true</span>);
sql&gt; select * from MADlib.svm_reg_output2;
</pre></div></li>
</ol>
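<p>When the regression model built above is no longer needed, the tables pertaining to it can be removed with <a class="el" href="online__sv_8sql__in.html#ab54d33f13c0e00faa358e3e3f17c10fb">svm_drop_model</a>, as described in the implementation notes above; for example: </p>
<div class="fragment"><pre class="fragment">sql&gt; select MADlib.svm_drop_model(<span class="stringliteral">&#39;myexp&#39;</span>);
</pre></div>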
<p><b>Example usage for classification:</b></p>
<ol type="1">
<li>We can randomly generate 2,000 5-dimensional data points labelled by the simple target function <div class="fragment"><pre class="fragment">t(x) = <span class="keywordflow">if</span> x[1] &gt; 0 and x[2] &lt; 0 then 1 <span class="keywordflow">else</span> -1;
</pre></div> and store them in the my_schema.my_train_data table as follows: <div class="fragment"><pre class="fragment">sql&gt; select MADlib.svm_generate_cls_data(<span class="stringliteral">&#39;my_schema.my_train_data&#39;</span>, 2000, 5);
</pre></div></li>
<li>We can now learn a classification model and store the resultant model under the name 'myexpc'. <div class="fragment"><pre class="fragment">sql&gt; select MADlib.svm_classification(<span class="stringliteral">&#39;my_schema.my_train_data&#39;</span>, <span class="stringliteral">&#39;myexpc&#39;</span>, <span class="keyword">false</span>, <span class="stringliteral">&#39;MADlib.svm_dot&#39;</span>);
</pre></div></li>
<li>We can now start using it to predict the labels of new data points as follows: <div class="fragment"><pre class="fragment">sql&gt; select MADlib.svm_predict(<span class="stringliteral">&#39;myexpc&#39;</span>, <span class="stringliteral">&#39;{10,-2,4,20,10}&#39;</span>);
</pre></div></li>
<li>To learn multiple support vector models, replace the model-building and prediction steps above by <div class="fragment"><pre class="fragment">sql&gt; select MADlib.svm_classification(<span class="stringliteral">&#39;my_schema.my_train_data&#39;</span>, <span class="stringliteral">&#39;myexpc&#39;</span>, <span class="keyword">true</span>, <span class="stringliteral">&#39;MADlib.svm_dot&#39;</span>);
sql&gt; select * from MADlib.svm_predict_combo(<span class="stringliteral">&#39;myexpc&#39;</span>, <span class="stringliteral">&#39;{10,-2,4,20,10}&#39;</span>);
</pre></div></li>
<li>To learn a linear support vector model using SGD, replace the model-building and prediction steps above by <div class="fragment"><pre class="fragment">sql&gt; select MADlib.lsvm_classification(<span class="stringliteral">&#39;my_schema.my_train_data&#39;</span>, <span class="stringliteral">&#39;myexpc&#39;</span>, <span class="keyword">false</span>);
sql&gt; select MADlib.lsvm_predict(<span class="stringliteral">&#39;myexpc&#39;</span>, <span class="stringliteral">&#39;{10,-2,4,20,10}&#39;</span>);
</pre></div></li>
<li>To learn multiple linear support vector models using SGD, replace the model-building and prediction steps above by <div class="fragment"><pre class="fragment">sql&gt; select MADlib.lsvm_classification(<span class="stringliteral">&#39;my_schema.my_train_data&#39;</span>, <span class="stringliteral">&#39;myexpc&#39;</span>, <span class="keyword">true</span>);
sql&gt; select MADlib.lsvm_predict_combo(<span class="stringliteral">&#39;myexpc&#39;</span>, <span class="stringliteral">&#39;{10,-2,4,20,10}&#39;</span>);
</pre></div></li>
</ol>
<p><b>Example usage for novelty detection:</b></p>
<ol type="1">
<li>We can randomly generate 100 2-dimensional data points (the normal cases) and store them in the my_schema.my_train_data table as follows: <div class="fragment"><pre class="fragment">sql&gt; select MADlib.svm_generate_nd_data(<span class="stringliteral">&#39;my_schema.my_train_data&#39;</span>, 100, 2);
</pre></div></li>
<li>Learning and predicting using a single novelty detection model can be done as follows: <div class="fragment"><pre class="fragment">sql&gt; select MADlib.svm_novelty_detection(<span class="stringliteral">&#39;my_schema.my_train_data&#39;</span>, <span class="stringliteral">&#39;myexpnd&#39;</span>, <span class="keyword">false</span>, <span class="stringliteral">&#39;MADlib.svm_dot&#39;</span>);
sql&gt; select MADlib.svm_predict(<span class="stringliteral">&#39;myexpnd&#39;</span>, <span class="stringliteral">&#39;{10,-10}&#39;</span>);
sql&gt; select MADlib.svm_predict(<span class="stringliteral">&#39;myexpnd&#39;</span>, <span class="stringliteral">&#39;{-1,-1}&#39;</span>);
</pre></div></li>
<li>Learning and predicting using multiple models can be done as follows: <div class="fragment"><pre class="fragment">sql&gt; select MADlib.svm_novelty_detection(<span class="stringliteral">&#39;my_schema.my_train_data&#39;</span>, <span class="stringliteral">&#39;myexpnd&#39;</span>, <span class="keyword">true</span>, <span class="stringliteral">&#39;MADlib.svm_dot&#39;</span>);
sql&gt; select * from MADlib.svm_predict_combo(<span class="stringliteral">&#39;myexpnd&#39;</span>, <span class="stringliteral">&#39;{10,-10}&#39;</span>);
sql&gt; select * from MADlib.svm_predict_combo(<span class="stringliteral">&#39;myexpnd&#39;</span>, <span class="stringliteral">&#39;{-1,-1}&#39;</span>);
</pre></div></li>
</ol>
<dl class="user"><dt><b>Literature:</b></dt><dd></dd></dl>
<p>[1] Jyrki Kivinen, Alexander J. Smola, and Robert C. Williamson: <em>Online Learning with Kernels</em>, IEEE Transactions on Signal Processing, 52(8), 2165-2176, 2004.</p>
<p>[2] Bernhard Sch&ouml;lkopf and Alexander J. Smola: <em>Learning with Kernels: Support Vector Machines, Regularization, Optimization, and Beyond</em>, MIT Press, 2002.</p>
<p>[3] L&eacute;on Bottou: <em>Large-Scale Machine Learning with Stochastic Gradient Descent</em>, Proceedings of the 19th International Conference on Computational Statistics, Springer, 2010.</p>
<dl class="see"><dt><b>See also:</b></dt><dd>File <a class="el" href="online__sv_8sql__in.html" title="SQL functions for support vector machines.">online_sv.sql_in</a> documenting the SQL functions. </dd></dl>
</div>
</div>
<div id="nav-path" class="navpath">
<ul>
<!-- window showing the filter options -->
<div id="MSearchSelectWindow"
onmouseover="return searchBox.OnSearchSelectShow()"
onmouseout="return searchBox.OnSearchSelectHide()"
onkeydown="return searchBox.OnSearchSelectKey(event)">
<a class="SelectItem" href="javascript:void(0)" onclick="searchBox.OnSelectItem(0)"><span class="SelectionMark">&#160;</span>All</a><a class="SelectItem" href="javascript:void(0)" onclick="searchBox.OnSelectItem(1)"><span class="SelectionMark">&#160;</span>Files</a><a class="SelectItem" href="javascript:void(0)" onclick="searchBox.OnSelectItem(2)"><span class="SelectionMark">&#160;</span>Functions</a></div>
<!-- iframe showing the search results (closed by default) -->
<div id="MSearchResultsWindow">
<iframe src="javascript:void(0)" frameborder="0"
name="MSearchResults" id="MSearchResults">
</iframe>
</div>
<li class="footer">Generated on Tue Apr 2 2013 14:57:03 for MADlib by
<a href="http://www.doxygen.org/index.html">
<img class="footer" src="doxygen.png" alt="doxygen"/></a> 1.7.5.1 </li>
</ul>
</div>
</body>
</html>