HCATALOG-393 Several hcat 0.4 document fixes


git-svn-id: https://svn.apache.org/repos/asf/incubator/hcatalog/branches/branch-0.4.0-rc4@1332463 13f79535-47bb-0310-9956-ffa450edef68
diff --git a/CHANGES.txt b/CHANGES.txt
index 1050396..53d1ecf 100644
--- a/CHANGES.txt
+++ b/CHANGES.txt
@@ -108,6 +108,8 @@
   OPTIMIZATIONS
 
   BUG FIXES
+  HCAT-393 Several hcat 0.4 document fixes (daijy via gates)
+
   HCAT-395 Hcat 0.4 last minute doc fixes (gates)
 
   HCAT-394 HCatalog 0.4 should pull hive-0.9.0 from maven instead of 0.9.0-SNAPSHOT (gates)
diff --git a/src/docs/src/documentation/content/xdocs/cli.xml b/src/docs/src/documentation/content/xdocs/cli.xml
index 81b8ede..bc08571 100644
--- a/src/docs/src/documentation/content/xdocs/cli.xml
+++ b/src/docs/src/documentation/content/xdocs/cli.xml
@@ -27,7 +27,7 @@
 <section>
 	<title>Set Up</title>
 <p>The HCatalog command line interface (CLI) can be invoked as
-<code>HIVE_HOME=</code><em>hive_home hcat_home</em><code>bin/hcat</code>
+<code>HIVE_HOME=</code><em>hive_home hcat_home</em><code>/bin/hcat</code>
 where <em>hive_home</em> is the directory where Hive has been installed and
 <em>hcat_home</em> is the directory where HCatalog has been installed.</p>
 
@@ -82,6 +82,7 @@
      <li>ALTER TABLE ... REBUILD</li> 
      <li>ALTER TABLE ... CONCATENATE</li>
      <li>ANALYZE TABLE ... COMPUTE STATISTICS</li>
+     <li>ALTER TABLE ... ARCHIVE/UNARCHIVE PARTITION</li>
    </ul>
 
 <section>
@@ -161,6 +162,10 @@
 	
 	<!-- ==================================================================== -->
 <section>
+	<title>"dfs" and "set" Commands</title>
+	<p>Supported. The behavior is the same as in Hive.</p>
+</section>
+<section>
 	<title>Other Commands</title>
 	<p>Any command not listed above is NOT supported and throws an exception with the message "Operation Not Supported". </p>
 </section>
diff --git a/src/docs/src/documentation/content/xdocs/index.xml b/src/docs/src/documentation/content/xdocs/index.xml
index 7afc863..791c6b3 100644
--- a/src/docs/src/documentation/content/xdocs/index.xml
+++ b/src/docs/src/documentation/content/xdocs/index.xml
@@ -45,7 +45,7 @@
 <title>Interfaces</title>   
 <p>The HCatalog interface for Pig – HCatLoader and HCatStorer – is an implementation of the Pig load and store interfaces. HCatLoader accepts a table to read data from; you can indicate which partitions to scan by immediately following the load statement with a partition filter statement. HCatStorer accepts a table to write to and optionally a specification of partition keys to create a new partition. You can write to a single partition by specifying the partition key(s) and value(s) in the STORE clause; and you can write to multiple partitions if the partition key(s) are columns in the data being stored. HCatLoader and HCatStorer are implemented on top of HCatInputFormat and HCatOutputFormat, respectively (see <a href="loadstore.html">HCatalog Load and Store</a>).</p>
 
-<p>The HCatalog interface for MapReduce – HCatInputFormat and HCatOutputFormat – is an implementation of Hadoop InputFormat and OutputFormat. HCatInputFormat accepts a table to read data from and optionally a selection predicate to indicate which partitions to scan. HCatOutputFormat accepts a table to write to and optionally a specification of partition keys to create a new partition. You can write to a single partition by specifying the partition key(s) and value(s) in the STORE clause; and you can write to multiple partitions if the partition key(s) are columns in the data being stored. (See <a href="inputoutput.html">HCatalog Input and Output</a>.)</p>
+<p>The HCatalog interface for MapReduce – HCatInputFormat and HCatOutputFormat – is an implementation of Hadoop InputFormat and OutputFormat. HCatInputFormat accepts a table to read data from and optionally a selection predicate to indicate which partitions to scan. HCatOutputFormat accepts a table to write to and optionally a specification of partition keys to create a new partition. You can write to a single partition by specifying the partition key(s) and value(s) in the setOutput method; and you can write to multiple partitions if the partition key(s) are columns in the data being stored. (See <a href="inputoutput.html">HCatalog Input and Output</a>.)</p>
 
 <p>Note: There is no Hive-specific interface. Since HCatalog uses Hive's metastore, Hive can read data in HCatalog directly.</p>
 
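A minimal sketch of the MapReduce interface described above, assuming the 0.4-era org.apache.hcatalog.mapreduce API; the database, table, filter, and partition values are illustrative placeholders, not part of this patch:

import java.util.HashMap;
import java.util.Map;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hcatalog.mapreduce.HCatInputFormat;
import org.apache.hcatalog.mapreduce.HCatOutputFormat;
import org.apache.hcatalog.mapreduce.InputJobInfo;
import org.apache.hcatalog.mapreduce.OutputJobInfo;

public class HCatMRExample {
  public static void main(String[] args) throws Exception {
    Job job = new Job(new Configuration(), "hcat-mr-example");

    // Read: name a table, and optionally restrict the partitions scanned
    // with a selection predicate on the partition columns.
    HCatInputFormat.setInput(job,
        InputJobInfo.create("default", "rawevents", "date = \"20100819\""));
    job.setInputFormatClass(HCatInputFormat.class);

    // Write: for a single partition, the partition key(s) and value(s) are
    // passed to setOutput via OutputJobInfo. Passing null instead writes to
    // multiple partitions when the partition keys are columns in the data.
    Map<String, String> partitionValues = new HashMap<String, String>();
    partitionValues.put("date", "20100819");
    HCatOutputFormat.setOutput(job,
        OutputJobInfo.create("default", "processedevents", partitionValues));
    job.setOutputFormatClass(HCatOutputFormat.class);

    // Publish the output schema; here the table's own schema is reused.
    HCatOutputFormat.setSchema(job, HCatOutputFormat.getTableSchema(job));
  }
}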
@@ -82,7 +82,7 @@
 
 <p>With HCatalog, a JMS message is sent announcing that the data is available. The Pig job can then be started.</p>
 <source>
-A = load 'rawevents' using HCatLoader;
+A = load 'rawevents' using HCatLoader();
 B = filter A by date == '20100819' and bot_finder(zeta) == 0;

 store Z into 'processedevents' using HCatStorer("date=20100819");
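The notification above can be consumed with a plain javax.jms listener. A hedged sketch: the connection factory, topic name, and runPigJob helper are hypothetical placeholders, not an HCatalog API.

import javax.jms.Connection;
import javax.jms.ConnectionFactory;
import javax.jms.Message;
import javax.jms.MessageConsumer;
import javax.jms.MessageListener;
import javax.jms.Session;
import javax.jms.Topic;

public class RawEventsListener {
  public static void listen(ConnectionFactory connectionFactory) throws Exception {
    Connection conn = connectionFactory.createConnection();
    Session session = conn.createSession(false, Session.AUTO_ACKNOWLEDGE);
    // The topic a table's notifications appear on is deployment-specific;
    // this name is a placeholder.
    Topic topic = session.createTopic("hcat.default.rawevents");
    MessageConsumer consumer = session.createConsumer(topic);
    consumer.setMessageListener(new MessageListener() {
      public void onMessage(Message msg) {
        runPigJob(); // hypothetical: submit the Pig script shown above
      }
    });
    conn.start();
  }

  private static void runPigJob() { /* submit the Pig job */ }
}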
diff --git a/src/docs/src/documentation/content/xdocs/inputoutput.xml b/src/docs/src/documentation/content/xdocs/inputoutput.xml
index decfffe..a3c40d3 100644
--- a/src/docs/src/documentation/content/xdocs/inputoutput.xml
+++ b/src/docs/src/documentation/content/xdocs/inputoutput.xml
@@ -149,28 +149,22 @@
 export HADOOP_HOME=&lt;path_to_hadoop_install&gt;
 export HCAT_HOME=&lt;path_to_hcat_install&gt;
+export HIVE_HOME=&lt;path_to_hive_install&gt;
 export LIB_JARS=$HCAT_HOME/share/hcatalog/hcatalog-0.4.0.jar,
-$HCAT_HOME/share/hcatalog/lib/hive-metastore-0.8.1.jar,$HCAT_HOME/share/hcatalog/lib/libthrift-0.7.0.jar,
-$HCAT_HOME/share/hcatalog/lib/hive-exec-0.8.1.jar,$HCAT_HOME/share/hcatalog/lib/libfb303-0.7.0.jar,
-$HCAT_HOME/share/hcatalog/lib/jdo2-api-2.3-ec.jar,$HCAT_HOME/share/hcatalog/lib/slf4j-api-1.6.1.jar,
-$HCAT_HOME/share/hcatalog/lib/antlr-runtime-3.0.1.jar,
-$HCAT_HOME/share/hcatalog/lib/datanucleus-connectionpool-2.0.3.jar,
-$HCAT_HOME/share/hcatalog/lib/datanucleus-core-2.0.3.jar,
-$HCAT_HOME/share/hcatalog/lib/datanucleus-enhancer-2.0.3.jar,
-$HCAT_HOME/share/hcatalog/lib/datanucleus-rdbms-2.0.3.jar,
-$HCAT_HOME/share/hcatalog/lib/commons-dbcp-1.4.jar,
-$HCAT_HOME/share/hcatalog/lib/commons-pool-1.5.4.jar
+$HIVE_HOME/lib/hive-metastore-0.9.0.jar,
+$HIVE_HOME/lib/libthrift-0.7.0.jar,
+$HIVE_HOME/lib/hive-exec-0.9.0.jar,
+$HIVE_HOME/lib/libfb303-0.7.0.jar,
+$HIVE_HOME/lib/jdo2-api-2.3-ec.jar,
+$HIVE_HOME/lib/slf4j-api-1.6.1.jar
+
 export HADOOP_CLASSPATH=$HCAT_HOME/share/hcatalog/hcatalog-0.4.0.jar:
-$HCAT_HOME/share/hcatalog/lib/hive-metastore-0.8.1.jar:$HCAT_HOME/share/hcatalog/lib/libthrift-0.7.0.jar:
-$HCAT_HOME/share/hcatalog/lib/hive-exec-0.8.1.jar:$HCAT_HOME/share/hcatalog/lib/libfb303-0.7.0.jar:
-$HCAT_HOME/share/hcatalog/lib/jdo2-api-2.3-ec.jar:$HCAT_HOME/share/hcatalog/lib/slf4j-api-1.6.1.jar:
-$HCAT_HOME/share/hcatalog/lib/antlr-runtime-3.0.1.jar:
-$HCAT_HOME/share/hcatalog/lib/datanucleus-connectionpool-2.0.3.jar:
-$HCAT_HOME/share/hcatalog/lib/datanucleus-core-2.0.3.jar:
-$HCAT_HOME/share/hcatalog/lib/datanucleus-enhancer-2.0.3.jar:
-$HCAT_HOME/share/hcatalog/lib/datanucleus-rdbms-2.0.3.jar:
-$HCAT_HOME/share/hcatalog/lib/commons-dbcp-1.4.jar:
-$HCAT_HOME/share/hcatalog/lib/commons-pool-1.5.4.jar:
-$HCAT_HOME/etc/hcatalog
+$HIVE_HOME/lib/hive-metastore-0.9.0.jar:
+$HIVE_HOME/lib/libthrift-0.7.0.jar:
+$HIVE_HOME/lib/hive-exec-0.9.0.jar:
+$HIVE_HOME/lib/libfb303-0.7.0.jar:
+$HIVE_HOME/lib/jdo2-api-2.3-ec.jar:
+$HIVE_HOME/conf:$HADOOP_HOME/conf:
+$HIVE_HOME/lib/slf4j-api-1.6.1.jar
 
 $HADOOP_HOME/bin/hadoop --config $HADOOP_HOME/conf jar &lt;path_to_jar&gt;
 &lt;main_class&gt; -libjars $LIB_JARS &lt;program_arguments&gt;
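Note that -libjars is a Hadoop generic option, so the main class must let GenericOptionsParser consume it, typically by implementing Tool and launching through ToolRunner. A minimal skeleton; the class and job names are illustrative:

import org.apache.hadoop.conf.Configured;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.util.Tool;
import org.apache.hadoop.util.ToolRunner;

public class MyHCatDriver extends Configured implements Tool {
  public int run(String[] args) throws Exception {
    // getConf() already reflects -libjars and the other generic options.
    Job job = new Job(getConf(), "hcat-mr-example");
    // ... configure HCatInputFormat/HCatOutputFormat, mapper, reducer ...
    return job.waitForCompletion(true) ? 0 : 1;
  }

  public static void main(String[] args) throws Exception {
    System.exit(ToolRunner.run(new MyHCatDriver(), args));
  }
}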
diff --git a/src/docs/src/documentation/content/xdocs/install.xml b/src/docs/src/documentation/content/xdocs/install.xml
index a119fa6..e841a00 100644
--- a/src/docs/src/documentation/content/xdocs/install.xml
+++ b/src/docs/src/documentation/content/xdocs/install.xml
@@ -113,7 +113,7 @@
     where you have installed Hive.  If you are using Hive rpms, then this will
     be <code>/usr/lib/hive</code>.</p>
 
-    <p><code>mysql -u hive -D hivemetastoredb -h</code><em>hivedb.acme.com</em><code> -p &lt; </code><em>hive_home</em><code>scripts/metastore/upgrade/mysql/hive-schema-0.9.0.mysql.sql</code></p>
+    <p><code>mysql -u hive -D hivemetastoredb -h</code><em>hivedb.acme.com</em><code> -p &lt; </code><em>hive_home</em><code>/scripts/metastore/upgrade/mysql/hive-schema-0.9.0.mysql.sql</code></p>
 
     <p><strong>Thrift Server Setup</strong></p>
 
diff --git a/src/docs/src/documentation/content/xdocs/loadstore.xml b/src/docs/src/documentation/content/xdocs/loadstore.xml
index 9998aae..347bde6 100644
--- a/src/docs/src/documentation/content/xdocs/loadstore.xml
+++ b/src/docs/src/documentation/content/xdocs/loadstore.xml
@@ -115,19 +115,16 @@
 <source>
 export HADOOP_HOME=&lt;path_to_hadoop_install&gt;
 export HCAT_HOME=&lt;path_to_hcat_install&gt;
+export HIVE_HOME=&lt;path_to_hive_install&gt;
-PIG_CLASSPATH=$HCAT_HOME/share/hcatalog/hcatalog-0.4.0.jar:$HCAT_HOME/share/hcatalog/lib/
-hive-metastore-0.8.1.jar:$HCAT_HOME/share/hcatalog/lib/libthrift-0.7.0.jar:$HCAT_HOME/
-share/hcatalog/lib/hive-exec-0.8.1.jar:$HCAT_HOME/share/hcatalog/lib/libfb303-0.7.0.jar:
-$HCAT_HOME/share/hcatalog/lib/jdo2-api-2.3-ec.jar:$HCAT_HOME/etc/hcatalog:$HADOOP_HOME/
-conf:$HCAT_HOME/share/hcatalog/lib/slf4j-api-1.6.1.jar
+export PIG_CLASSPATH=$HCAT_HOME/share/hcatalog/hcatalog-0.4.0.jar:$HIVE_HOME/lib/hive-metastore-0.9.0.jar:
+$HIVE_HOME/lib/libthrift-0.7.0.jar:$HIVE_HOME/lib/hive-exec-0.9.0.jar:$HIVE_HOME/lib/libfb303-0.7.0.jar:
+$HIVE_HOME/lib/jdo2-api-2.3-ec.jar:$HIVE_HOME/conf:$HADOOP_HOME/conf:$HIVE_HOME/lib/slf4j-api-1.6.1.jar
+
 export PIG_OPTS=-Dhive.metastore.uris=thrift://&lt;hostname&gt;:&lt;port&gt;
 
-&lt;path_to_pig_install&gt;/bin/pig -Dpig.additional.jars=$HCAT_HOME/share/hcatalog/
-hcatalog-0.4.0.jar:$HCAT_HOME/share/hcatalog/lib/hive-metastore-0.8.1.jar:$HCAT_HOME/
-share/hcatalog/lib/libthrift-0.7.0.jar:$HCAT_HOME/share/hcatalog/lib/hive-exec-0.8.1.jar:
-$HCAT_HOME/share/hcatalog/lib/libfb303-0.7.0.jar:$HCAT_HOME/share/hcatalog/lib/jdo2-
-api-2.3-ec.jar:$HCAT_HOME/etc/hcatalog:$HCAT_HOME/share/hcatalog/lib/slf4j-api-1.6.1.jar
- &lt;script.pig&gt;
+&lt;path_to_pig_install&gt;/bin/pig -Dpig.additional.jars=$HCAT_HOME/share/hcatalog/hcatalog-0.4.0.jar:
+$HIVE_HOME/lib/hive-metastore-0.9.0.jar:$HIVE_HOME/lib/libthrift-0.7.0.jar:$HIVE_HOME/lib/hive-exec-0.9.0.jar:
+$HIVE_HOME/lib/libfb303-0.7.0.jar:$HIVE_HOME/lib/jdo2-api-2.3-ec.jar:$HIVE_HOME/lib/slf4j-api-1.6.1.jar &lt;script.pig&gt;
 </source>
 
 <p><strong>Authentication</strong></p>