Once, while preparing a talk about Gradle plugin development, we ran into a question: how do you actually test what you build? Life without tests is bad in general, and it gets worse when your code runs in a separate process: you still want to debug, you want fast startup, and you do not want to write a million sample projects to cover every possible case. Below the cut is a comparison of several testing approaches that we managed to try.
Our guinea pig is a project that tolkkv and I prepared for the JPoint 2016 conference. In short, we wrote a plugin that collects documentation from different projects and generates a plain HTML document with cross-reference links. But this post is not about how we wrote the plugin itself (although that was fun and exciting too), it is about how to test what you write. I confess: we tested almost the entire project via integration, through example projects. At some point we realized it was worth thinking about other ways of testing. So, our candidates: Gradle TestKit, Nebula Test, and plain unit tests with ProjectBuilder.
The challenge is the same in every case: check that our documentation plugin gets applied and that its task can execute successfully. Let's go.
The first candidate, Gradle TestKit, is still in the incubation stage, which became very noticeable when we tried to wire it up. If we take the example from the documentation and naively apply it to our situation (see below), it will not work. Let's walk through what we did.
import groovy.util.logging.Slf4j
import org.gradle.testkit.runner.GradleRunner
import org.gradle.testkit.runner.TaskOutcome
import org.junit.Rule
import org.junit.rules.TemporaryFolder
import spock.lang.Specification

@Slf4j
class TestSpecification extends Specification {

    @Rule
    final TemporaryFolder testProjectDir = new TemporaryFolder()

    def buildFile

    def setup() {
        buildFile = testProjectDir.newFile('build.gradle')
    }

    def "execution of documentation distribution task is up to date"() {
        given:
        buildFile << """
            buildscript {
                repositories {
                    jcenter()
                }
                dependencies {
                    classpath 'org.asciidoctor:asciidoctor-gradle-plugin:1.5.3'
                }
            }

            apply plugin: 'org.asciidoctor.convert'
            apply plugin: 'ru.jpoint.documentation'

            docs {
                debug = true
            }

            dependencies {
                asciidoctor 'org.asciidoctor:asciidoctorj:1.5.4'
                docs 'org.slf4j:slf4j-api:1.7.2'
            }
        """

        when:
        def result = GradleRunner.create()
                .withProjectDir(testProjectDir.root)
                .withArguments('documentationDistZip')
                .build()

        then:
        result.task(":documentationDistZip").outcome == TaskOutcome.UP_TO_DATE
    }
}
We use Spock, although you could use plain JUnit. The project under test lives and runs in a temporary folder referenced via testProjectDir. In the setup method we create a fresh build file for that project. In the given block we define its content: we apply the plugins we need and declare the dependencies. In the when block, the new GradleRunner class gets the previously prepared project directory and is told which task from the plugin to run. In the then block we check that the task exists; since we did not add any documents, it should not need to execute (UP_TO_DATE).
So, when we run the test, we learn that the test framework has no idea what the ru.jpoint.documentation plugin we applied actually is. Why does this happen? Because GradleRunner currently does not pass the plugin classpath into the build it launches, and that greatly limits our testing. We go back to the documentation and find the withPluginClasspath method, to which we can pass the resources we need so that they get picked up during the test run. It remains to figure out how to build that classpath.
If you think this is obvious, think again. To solve the problem you have to write a separate task yourself (thank you, Gradle, for the imperative approach) that writes a text file with the set of classpath entries into the build directory. We write:
task createClasspathManifest {
    def outputDir = sourceSets.test.output.resourcesDir

    inputs.files sourceSets.main.runtimeClasspath
    outputs.dir outputDir

    doLast {
        outputDir.mkdirs()
        file("$outputDir/plugin-classpath.txt").text = sourceSets.main.runtimeClasspath.join("\n")
    }
}
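To make sure the manifest actually exists before the tests run, the TestKit documentation of that era also wires the task's output into the test runtime classpath. A minimal sketch of that wiring, assuming the standard configurations added by the java/groovy plugin:

// sketch: putting the generated manifest on the test runtime classpath
// also makes the test task depend on createClasspathManifest
dependencies {
    testRuntime files(createClasspathManifest)
}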
We run it and get the file. Now we go back to our test and add the following code to setup, which reads the manifest:
def pluginClasspathResource = getClass().classLoader.findResource("plugin-classpath.txt")

if (pluginClasspathResource == null) {
    throw new IllegalStateException("Did not find plugin classpath resource, run `testClasses` build task.")
}

pluginClasspath = pluginClasspathResource.readLines()
        .collect { new File(it) }
Now we pass this classpath to GradleRunner.
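A sketch of how the runner invocation might look at this point (pluginClasspath is the list of files built in setup above):

def result = GradleRunner.create()
        .withProjectDir(testProjectDir.root)
        .withArguments('documentationDistZip')
        .withPluginClasspath(pluginClasspath)   // pass the plugin classpath collected in setup
        .build()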
We run it, and nothing works. A trip to the forums tells us that this only works with Gradle 2.8+. We check: we are on 2.12, yet it still fails, and we are sad. What to do? Let's try the approach advised for Gradle 2.7 and below: build the classpath string ourselves and add it directly to the buildscript block:
def classpathString = pluginClasspath
        .collect { it.absolutePath.replace('\\', '\\\\') }
        .collect { "'$it'" }
        .join(", ")
and substitute it into the buildscript section of the generated build.gradle:

dependencies {
    classpath files($classpathString)
    ...
}
We run it, and it works. But that is not the end of the problems: you can read the epic thread on the topic and get completely sad.
Update for 2.13: when we were experimenting, this version had not been released yet. It finally fixes the problem with picking up the plugin classpath, and the code now looks much more decent. To use it, the plugin has to be applied in a slightly different way:
plugins {
    id 'ru.jpoint.documentation'
}
and GradleRunner has to be launched with a no-argument withPluginClasspath():
def result = GradleRunner.create()
        .withProjectDir(testProjectDir.root)
        .withArguments('documentationDistZip')
        .withPluginClasspath()
        .build()
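For the no-argument withPluginClasspath() to know where the plugin lives, the plugin's own build has to produce the metadata that TestKit reads; as far as we know this is done by applying the java-gradle-plugin to the plugin project itself. A sketch of the assumed setup, not taken from the original project:

// build.gradle of the plugin project itself (assumed setup)
apply plugin: 'groovy'               // or 'java', depending on the plugin's language
apply plugin: 'java-gradle-plugin'   // generates the plugin-under-test metadata used by TestKit 2.13+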
The only remaining disappointment is that you cannot run this test from IDEA via the context menu, because the IDE does not know how to substitute the necessary resources. Through ./gradlew everything works great.
Fix from d10xa: in the settings select the Gradle Test Runner (Settings -> Build -> Build Tools -> Gradle -> Runner), delete the existing run configuration and run the test again.
The bottom line: the direction is right, but using it still hurts at times.
The second candidate, Nebula Test, showed itself much better. All you need to do is add it to your dependencies:
functionalTestCompile 'com.netflix.nebula:nebula-test:4.0.0'
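The specification itself extends Nebula's IntegrationSpec base class, which provides the buildFile, createFile and runTasksSuccessfully helpers used below. A minimal skeleton (the class name here is ours, not from the original project):

import nebula.test.IntegrationSpec

class DocumentationPluginIntegrationSpec extends IntegrationSpec {
    // buildFile, createFile(...) and runTasksSuccessfully(...) are inherited from IntegrationSpec
}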
Then, in the specification, we can create the build.gradle file by analogy with the previous example:
def setup() {
    buildFile << """
        buildscript {
            repositories {
                jcenter()
            }
            dependencies {
                classpath 'org.asciidoctor:asciidoctor-gradle-plugin:1.5.3'
            }
        }

        apply plugin: 'org.asciidoctor.convert'
        apply plugin: info.developerblog.documentation.plugin.DocumentationPlugin

        docs {
            debug = true
        }

        dependencies {
            asciidoctor 'org.asciidoctor:asciidoctorj:1.5.4'
            docs 'org.slf4j:slf4j-api:1.7.2'
        }
    """
}
The test itself looks simple, understandable, and most importantly, it runs without any gymnastics:
def "execution of documentation distribution task is success"() { when: createFile("/src/docs/asciidoc/documentation.adoc") ExecutionResult executionResult = runTasksSuccessfully('documentationDistZip') then: executionResult.wasExecuted('documentationDistZip') executionResult.getSuccess() }
In this example we also created a documentation file, so this time the result of executing our task is SUCCESS.
Bottom line: everything is very cool. Recommended for use.
OK, everything we did so far is essentially integration testing. Let's see what we can do with unit tests.
First, we configure the project directly in code:
def setup() {
    project = new ProjectBuilder().build()

    project.buildscript.repositories {
        jcenter()
    }
    project.buildscript.dependencies {
        classpath 'org.asciidoctor:asciidoctor-gradle-plugin:1.5.3'
    }

    project.plugins.apply('org.asciidoctor.convert')
    project.plugins.apply(DocumentationPlugin.class)

    project.dependencies {
        asciidoctor 'org.asciidoctor:asciidoctorj:1.5.4'
        docs 'org.slf4j:slf4j-api:1.7.2'
    }
}
As you can see, this is practically the same as what we wrote earlier, only the closures are a bit more verbose.
Now we can check that the task from our plugin actually appeared in the configured project (and that the configuration succeeded at all):
def "execution of documentation distribution task is success"() { when: project then: project.getTasksByName('documentationDistZip', true).size() == 1 }
But that is about as far as we can go. With this approach we cannot verify that the task actually does what it is supposed to do, for example that the document really gets generated.
Bottom line: usable for checking project configuration, and faster than testing through actual execution, but the possibilities are very limited.
For plugin testing we recommend Nebula Test. If you have elaborate logic in the project configuration, it also makes sense to look at unit testing. And we keep waiting for Gradle TestKit to mature.
Link to the project with tests and plugin: https://github.com/aatarasoff/documentation-plugin-demo
Source: https://habr.com/ru/post/282924/