Java base client probe for Probe Dock.

Requirements:

- Java 6+
In your `pom.xml` file:

```xml
<dependency>
  <groupId>io.probedock.client</groupId>
  <artifactId>probedock-java</artifactId>
  <version>0.4.1</version>
</dependency>
```
The probes based on this library can configure the category through package pattern matching:

```yml
...
java:
  categoriesByPackage:
    io.probedock.integration**: Integration
    io.probedock.api: API
    io.probedock.e2e.*: End to end
...
```
Once a pattern matches the test class package, the corresponding category is used. The class annotation takes precedence over this behavior, and the method annotation takes precedence over the category defined in the class annotation. The category is therefore resolved in the following order:

- From the method annotation
- From the class annotation
- From the package pattern matching
- From the configuration file
  1. From the project configuration
  2. From the main configuration
- The default category hardcoded in the probe
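The fallback chain above can be sketched as a simple sequence of null checks. The class and method names below are purely illustrative and are not part of the probedock-java API:

```java
// Illustrative sketch of the category resolution order described above.
// None of these names come from the probedock-java library; they only show
// the precedence: method annotation > class annotation > package matching
// > project config > main config > probe default.
public class CategoryResolver {

    public static String resolveCategory(
            String methodAnnotationCategory, // from the method annotation, may be null
            String classAnnotationCategory,  // from the class annotation, may be null
            String packageMatchCategory,     // from package pattern matching, may be null
            String projectConfigCategory,    // from the project configuration file, may be null
            String mainConfigCategory,       // from the main configuration file, may be null
            String probeDefaultCategory) {   // hardcoded default in the probe

        if (methodAnnotationCategory != null) return methodAnnotationCategory;
        if (classAnnotationCategory != null) return classAnnotationCategory;
        if (packageMatchCategory != null) return packageMatchCategory;
        if (projectConfigCategory != null) return projectConfigCategory;
        if (mainConfigCategory != null) return mainConfigCategory;
        return probeDefaultCategory;
    }
}
```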
The package pattern matching is done through a port of minimatch. The `/` is replaced by `.` when you specify the pattern in the configuration file. It is not possible to specify a file extension, and it would not make sense to do so, as packages are in fact only folders.
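To give a rough idea of how such a pattern is matched against a package name, here is a simplified glob-to-regex translation. This is not the actual minimatch port used by the library, only an illustration of the `*` vs `**` semantics:

```java
import java.util.regex.Pattern;

// Simplified sketch of package pattern matching, where '*' matches within a
// single package segment and '**' crosses segment boundaries. The real
// library uses a port of minimatch; this only illustrates the idea.
public class PackageMatcher {

    public static boolean matches(String pattern, String packageName) {
        StringBuilder regex = new StringBuilder();
        int i = 0;
        while (i < pattern.length()) {
            char c = pattern.charAt(i);
            if (c == '*') {
                if (i + 1 < pattern.length() && pattern.charAt(i + 1) == '*') {
                    regex.append(".*");    // '**' matches across '.' separators
                    i += 2;
                } else {
                    regex.append("[^.]*"); // '*' stays within one segment
                    i++;
                }
            } else {
                regex.append(Pattern.quote(String.valueOf(c)));
                i++;
            }
        }
        return packageName.matches(regex.toString());
    }
}
```

With the patterns from the configuration example above, `io.probedock.integration**` would match `io.probedock.integration.foo`, while `io.probedock.e2e.*` would match `io.probedock.e2e.web` but not `io.probedock.e2e.web.sub`.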
The `Connector` class is responsible for sending the results to Probe Dock. Here is an example of sending results to Probe Dock.
```java
/*
 * Create the connector with the singleton instance of the configuration.
 * The configuration instance takes care of loading the main configuration file in ~/.probedock/config.yml and
 * also the probedock.yml configuration file present in the project.
 */
Connector connector = new Connector(Configuration.getInstance());

/*
 * The result construction is discussed later.
 */
TestRun testRun = ...;

try {
  /*
   * That's it! The call to send() handles everything needed to send the results
   * to Probe Dock. It takes all the connection data from the configuration.
   */
  connector.send(testRun);
}
catch (MalformedURLException mue) {
  // Do something with the exception
}
```
The `ModelFactory` class is available to create the various models that compose the payload sent to Probe Dock. Here are some examples, presented in the correct sequence.
```java
/*
 * Create a context which contains various information about the Java runtime environment like the VM version, Java
 * version and memory. It is recommended to create the context before the first test is run, as it will also
 * capture the data about the memory.
 */
Context context = ModelFactory.createContext();

/*
 * There is a dedicated object to describe your probe. It is not mandatory but highly recommended.
 */
Probe probe = ModelFactory.createProbe("Junit", "1.2.3");

/*
 * Before collecting the first test result, you will need to create the test run to store the test results.
 */
TestRun testRun = ModelFactory.createTestRun(
  Configuration.getInstance(), // The configuration singleton (legacy reason)
  context, // The context with the various Java related data
  probe, // The data about your probe
  "adkekciakdk", // The project identifier given by Probe Dock and generally retrieved from the configuration
  "1.2.3", // The version of the project
  null, // Pipeline is not used yet
  null, // Stage is not used yet
  null, // List of test reports (deprecated, legacy)
  new HashMap<String, String>() // A map containing metadata about the test run
);

/*
 * Create a fingerprint based on the class where the test is defined and the method
 * which is the test.
 */
String fingerPrint = ModelFactory.createFingerPrint(testClass, testMethod);

/*
 * Now we are ready to collect the test results. Each time a test result is received, we need to create a test
 * result object.
 */
TestResult testResult = ModelFactory.createTestResult(
  "abcd", // The key generated by Probe Dock to identify the test
  fingerPrint, // Identifier generated by the probe to identify the test
  "The humanized test name", // The test name, which should be human friendly
  "Junit", // The test category. The probe should define a default one
  12L, // The execution duration
  "The test message which is usually a stack trace", // The message representing the test result (stack trace, ...)
  true, // Test failed/passed
  true, // Test active/inactive
  new HashSet<String>(), // A list of contributor emails
  new HashSet<String>(), // A list of developer defined tags about the test
  new HashSet<String>(), // A list of tickets (JIRA, ...)
  new HashMap<String, String>() // A map of metadata about the test (e.g. Java class, method, package)
);

/*
 * You probably want to add the package, class and method names. There is a helper method for that.
 */
ModelFactory.enrichTestResult(testResult, "io.probedock.whatever", "SomeClass", "someMethod");

/*
 * Now we can add the test result to the test run. This must be repeated for each test collected.
 */
testRun.getTestResults().add(testResult);

/*
 * Once the last test has finished running, it is useful to enrich the context with additional data. In fact,
 * this will add the memory state after all the tests have executed.
 */
ModelFactory.enrichContext(context);

// Once you have collected all the results, you can send them through the Connector
```
To send results to Probe Dock, you need to:

- Create the testing `Context`
- Create the `Probe` data
- Create the `TestRun` to collect the tests
- Collect each test result
- For each test result collected:
  - Generate the fingerprint
  - Create the `TestResult`
  - Enrich the `TestResult`
  - Add the `TestResult` to the `TestRun`
- Finally, once all the tests have been executed, enrich the `Context` with additional data
- Then send the results to Probe Dock
- Fork
- Create a topic branch: `git checkout -b my_feature`
- Push to your branch: `git push origin my_feature`
- Create a pull request from your branch
Please add a changelog entry with your name for new features and bug fixes.
Probe Dock Java is licensed under the MIT License. See LICENSE.txt for the full license.