<!--
Licensed to the Apache Software Foundation (ASF) under one or more
contributor license agreements. See the NOTICE file distributed with
this work for additional information regarding copyright ownership.
The ASF licenses this file to You under the Apache License, Version 2.0
(the "License"); you may not use this file except in compliance with
the License. You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
-->
<document>
<properties>
<title>Testing</title>
<author email="dev@commons.apache.org">Apache Commons Developers</author>
</properties>
<body>
<section name="VFS Test Suite">
<p>
Apache Commons VFS comes with a suite of nearly 2000 tests (in <code>core/src/test</code>). The tests use the JUnit framework
and are executed at build time by the Maven
<a href="http://maven.apache.org/surefire/maven-surefire-plugin/">Surefire plugin</a> via the <code>mvn test</code> goal.
If you plan to contribute a patch for a bug or feature, make sure to also provide a test
which reproduces the bug or exercises the new feature, and run the whole test suite against the patched code.
</p>
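<p>
As a quick sketch (the checkout location <code>C:\commons-vfs2-project</code> is just an example): the whole
suite is run from the project root, while a single test class can be selected via the Surefire
<code>-Dtest</code> property from the <code>core</code> module; the class shown here is one of the HDFS tests
discussed further below:
</p>
<source><![CDATA[
> cd C:\commons-vfs2-project
> mvn clean test
> cd core
> mvn test -Dtest=org.apache.commons.vfs2.provider.hdfs.test.HdfsFileProviderTest]]></source>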
<p>
The <a href="http://junit.org">JUnit</a> tests cover unit, compilation, and integration tests that exercise both the API and the provider implementations.
The local file provider is tested in a directory of the local file system. The virtual providers (compression and archive)
and resource access are based on this test directory as well. For testing the other providers, some test servers are started.
The following table describes the details (for the versions used, have a look at the
<a href="commons-vfs2/dependencies.html#test">dependency report</a>):
</p>
<table><tr><th>Provider</th><th>Tested Against</th><th>External</th></tr>
<tr><td>ftp</td><td><a href="http://mina.apache.org/ftpserver-project/">Apache FtpServer</a></td><td>-Pftp -Dtest.ftp.uri=ftp://test:test@localhost:123</td></tr>
<tr><td>ftps</td><td><a href="http://mina.apache.org/ftpserver-project/">Apache FtpServer</a></td><td>-Pftps -Dtest.ftps.uri=ftps://test:test@localhost:123</td></tr>
<tr><td>hdfs</td><td>Apache Hadoop HDFS (<a href="https://wiki.apache.org/hadoop/HowToDevelopUnitTests">MiniDFSCluster</a>)</td><td>-P!no-test-hdfs (see below)</td></tr>
<tr><td>http</td><td>NHttpServer (local adaption of org.apache.http.examples.nio.NHttpServer)</td><td>-Phttp -Dtest.http.uri=http://localhost:123</td></tr>
<tr><td>https</td><td>(not tested)</td><td>N/A</td></tr>
<tr><td>jar</td><td>Local File Provider</td><td>N/A</td></tr>
<tr><td>local</td><td>Local File system</td><td>N/A</td></tr>
<tr><td>ram</td><td>In Memory test</td><td>N/A</td></tr>
<tr><td>res</td><td>Local File Provider / JAR Provider</td><td>N/A</td></tr>
<tr><td>sftp</td><td><a href="http://mina.apache.org/sshd-project/index.html">Apache SSHD</a></td><td>-Psftp -Dtest.sftp.uri=sftp://testtest@localhost:123</td></tr>
<tr><td>tmp</td><td>Local File system</td><td>N/A</td></tr>
<tr><td>url</td><td>NHttpServer (local adaption of org.apache.http.examples.nio.NHttpServer)<br/>Local File system</td><td>-Phttp -Dtest.http.uri=http://localhost:128</td></tr>
<tr><td>webdav</td><td><a href="http://jackrabbit.apache.org/standalone-server.html">Apache Jackrabbit Standalone Server</a></td><td>-Pwebdav -Dtest.webdav.uri=webdav://admin@localhost:123/repository/default</td></tr>
<!-- <tr><td>webdavs</td><td>Apache Jackrabbit Standalone</td><td>-Pwebdav -Dtest.webdav.uri=webdav://admin@localhost:123/repository/default</td></tr> -->
<tr><td>zip</td><td>Local File Provider</td><td>N/A</td></tr>
<tr><td>smb (sandbox)</td><td>(not tested)</td><td>-Psmb -Dtest.smb.uri=smb://DOMAIN\User:Pass@host/C$/commons-vfs2/core/target/test-classes/test-data</td></tr>
</table>
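<p>
The third column shows how to run a provider's tests against an external server instead of the one started
by the build: activate the profile and pass the URI as a system property. For example, for the FTP provider
(run from the <code>core</code> module; user, password, host, and port are the placeholders from the table
and need to be replaced with the details of your server):
</p>
<source><![CDATA[
> mvn test -Pftp -Dtest.ftp.uri=ftp://test:test@localhost:123]]></source>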
<p>
Some tests are operating-system specific. Some Windows file name tests are only run on Windows,
and the HDFS tests are skipped on Windows (because they require additional binaries). It is therefore
a good idea to run the tests at least on Windows and on Linux/Unix before a release. The <code>smb</code> provider
from the sandbox is not tested unless you specify a <code>-Dtest.smb.uri</code> system property and the <code>-Psmb</code> profile.
</p>
</section>
<section name="Running HDFS tests on Windows">
<p>
The HDFS integration tests use the HDFS MiniCluster. This does not work on Windows without special preparation:
you need to build and provide the (2.6.0) native binary (<code>winutils.exe</code>) and library (<code>hadoop.dll</code>) for the
MiniCluster used in the test cases. Neither file is part of the Hadoop Commons 2.6.0
distribution (<a href="https://issues.apache.org/jira/browse/HADOOP-10051">HADOOP-10051</a>). After you have built
a compatible version, put the files on your Windows <code>PATH</code> and then run the tests,
either by disabling the <code>no-test-hdfs</code> profile or by explicitly requesting the excluded tests:
</p>
<source><![CDATA[
> set VFS=C:\commons-vfs2-project
> cd %VFS%\core
> mkdir bin\
> copy \temp\winutils.exe \temp\hadoop.dll bin\
> set HADOOP_HOME=%VFS%\core
> set PATH=%VFS%\core\bin;%PATH%
> winutils.exe systeminfo
8518668288,8520572928,4102033408,4544245760,8,1600000,6042074
> mvn -P!no-test-hdfs clean test # runs all tests including the HDFS tests
> mvn clean test -Dtest=org.apache.commons.vfs2.provider.hdfs.test.HdfsFileProviderTest,org.apache.commons.vfs2.provider.hdfs.test.HdfsFileProviderTestCase
...
Tests run: 13, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 13.006 sec - in org.apache.commons.vfs2.provider.hdfs.test.HdfsFileProviderTest
Tests run: 77, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 11.728 sec - in org.apache.commons.vfs2.provider.hdfs.test.HdfsFileProviderTestCase]]></source>
</section>
<section name="Running SMB tests against Windows">
<p>
The SMB provider from the sandbox project cannot be tested automatically. You need to prepare a CIFS/SMB server
to test it manually. If you develop on Windows, the following procedure uses Windows File Sharing and does
not require you to prepare the data directory (as you can point directly at your workspace):
</p>
<source><![CDATA[
> set VFS=C:\commons-vfs2-project
> cd %VFS%
> mvn clean install -Pinclude-sandbox -DskipTests # prepares test data and parent
> cd %VFS%\sandbox
> mvn test -Psmb -Dtest.smb.uri=smb://Domain\User:Pass@yourhost/C$/commons-vfs2-project/core/target/test-classes/test-data
...
Tests run: 82, Failures: 0, Errors: 1, Skipped: 0]]></source>
<p>
Note: there is a known test failure in this case, see
<a href="https://issues.apache.org/jira/browse/VFS-562">VFS-562</a> on the JIRA bug tracker if you want
to help.
</p>
</section>
<section name="Running tests with external servers">
<p>
In order to test VFS for compatibility with other implementations (or, in the case of SMB, to
test it manually), some of the integration tests can be configured to connect to a custom URL.
This generally involves preparing the server, selecting a profile, and specifying the URL in
a system property (see the table above).
</p>
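<p>
As a sketch for the webdav provider, assuming an
<a href="http://jackrabbit.apache.org/standalone-server.html">Apache Jackrabbit Standalone Server</a> is
already running (the URI reuses the placeholder values from the table above; replace host, port and
credentials with those of your installation, and adjust the checkout path):
</p>
<source><![CDATA[
> cd C:\commons-vfs2-project\core
> mvn test -Pwebdav -Dtest.webdav.uri=webdav://admin@localhost:123/repository/default]]></source>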
<subsection name="Preparing external Servers">
<p>
If you want to run the tests against external servers, first run <code>mvn install</code>.
This will compile all the main and test sources and then run all the tests
for providers that use the local file system.
After running the Maven build, the test data can be found in
<code>core/target/test-classes/test-data/</code>.
</p>
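<p>
As a sketch, after the build you can copy this directory to the area exported by your server and then point
the corresponding <code>-Dtest.*.uri</code> property at it; the destination below is only a placeholder for
whatever directory or share your server publishes:
</p>
<source><![CDATA[
> cd C:\commons-vfs2-project
> xcopy core\target\test-classes\test-data \\yourserver\share\test-data /E /I]]></source>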
<p>
Each repository/server should contain the following list of files for the tests to
complete successfully.
</p>
<source><![CDATA[
code/sealed/AnotherClass.class
code/ClassToLoad.class
largefile.tar.gz
nested.jar
nested.tar
nested.tbz2
nested.tgz
nested.zip
read-tests/dir1/file1.txt
read-tests/dir1/file2.txt
read-tests/dir1/file3.txt
read-tests/dir1/subdir1/file1.txt
read-tests/dir1/subdir1/file2.txt
read-tests/dir1/subdir1/file3.txt
read-tests/dir1/subdir2/file1.txt
read-tests/dir1/subdir2/file2.txt
read-tests/dir1/subdir2/file3.txt
read-tests/dir1/subdir3/file1.txt
read-tests/dir1/subdir3/file2.txt
read-tests/dir1/subdir3/file3.txt
read-tests/empty.txt
read-tests/file1.txt
read-tests/file space.txt
read-tests/file%.txt
test-hash-#test.txt
test.jar
test.mf
test.policy
test.tar
test.tbz2
test.tgz
test.zip
write-tests/]]></source>
<p>
The Apache Commons Wiki contains a list of configuration examples for external servers.
Please consider contributing if you have set up a specific scenario:
<a href="https://wiki.apache.org/commons/VfsTestServers">https://wiki.apache.org/commons/VfsTestServers</a>.
</p>
</subsection>
</section>
</body>
</document>