
Invisible friends of your github repository

(cover image: the Benevocats)
Github is an indispensable tool, firmly embedded in the lives of almost every developer.

Although many of us use it all the time, not everyone knows that there are a large number of third-party (and free) services and tools that are tightly integrated with github and extend its functionality.

In this article, we will focus mainly on tools that live in the npm infrastructure. For a complete list of services that integrate with github, see the github integrations directory.

In this issue:

  - setting up continuous integration with travis ci;
  - test coverage reports with coveralls;
  - dependency status monitoring with david;
  - automatic dependency updates with greenkeeper;
  - better commit messages with commitizen;
  - generating a changelog and release notes with conventional-changelog;
  - task management with zube.

Most of the services reviewed provide informational badges that can be added to the project page. In order not to show the same image every time a badge is mentioned, we show it once at the beginning of the article:

(image: badge examples)

We will also need to write a few scripts. For this we will use npm scripts, avoiding tools like grunt or gulp, so that you do not need to know any particular build tool.

A brief lyrical digression about task runners
Not so long ago several smart people (for example, the well-known Cory House) voiced the idea that it is not at all necessary to use tools like grunt or gulp to build a project. In most cases simple npm scripts are enough.

Our own experience has convinced us that this idea is correct.

If you build your project with, say, gulp, then for every tool you use you have to find (or write) a plugin. For example, to compile typescript you need gulp-typescript, to run tslint you need gulp-tslint, to run typedoc you need gulp-typedoc. And so on.

Tools are updated periodically, and sometimes you are really looking forward to an update. Plugins, however, are sometimes updated with a delay, and sometimes not updated at all.

Take, for example, the long-awaited release of typescript 2.x. Updates to tslint and typedoc did not take long to arrive, but new versions of the corresponding gulp plugins were in no hurry to appear. As a result, the project could not be moved to the new typescript version because of an outdated gulp plugin for an ordinary auxiliary tool.

In addition, plugins are extra dependencies in your package.json, they often have dependencies of their own, and so on. All of this directly affects installation time, for example when a build starts in travis.

In general, our experience has shown that if you rewrite the build with npm scripts, life becomes easier.

It is also very nice to avoid tools that require a global installation. We will not discuss that idea in detail here and will simply share a link to a good article for those interested.


Live examples of the use of the described services can be found in the repositories of our e2e4 library (the simpler setups) and our right-angled angular grid library (the more interesting ones).

Setting up continuous integration with travis ci


Let's start with the obvious - each project needs continuous integration. Here travis ci is ready to help us.

Setting up a build in travis is quite simple and consists of the following steps:

  1. Login to travis using your github account.
  2. On the settings page, indicate which of our repositories travis should monitor.
  3. Add a ".travis.yml" file, containing the environment settings and the commands to run the build, to the root of the repository.

For example, like this:

    language: node_js
    node_js:
      - "6"
    script:
      - npm run ci


So we tell travis that we need an environment with nodejs version 6.

The build process consists of a single npm command that runs a script called “ci” from the “scripts” section of the package.json file.

In our case, this command runs tslint, then the typescript compiler, and then the tests with karma, one after another. If you are interested in the details, you can look them up in package.json on github.
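For those who do not want to dig through the repository, here is a minimal sketch of what such a "ci" script might look like. The script names, file globs and options below are assumptions for illustration, not a copy of the actual e2e4 package.json:

    "scripts": {
      "lint": "tslint \"src/**/*.ts\"",
      "build": "tsc -p tsconfig.json",
      "test": "karma start karma.conf.js --single-run",
      "ci": "npm run lint && npm run build && npm run test"
    }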

Also note that we did not specify "npm install" in the build commands. Travis itself understands that the dependencies need to be installed via npm and does it for us.

Moreover, if you use yarn, travis will detect this, install yarn, and use it to install the dependencies.

Travis has many such conventions built in, which save us from unnecessary steps.

Now we push the ".travis.yml" file, after which the first build will run. The build process can be watched in real time on the travis page.
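If you have never added a CI config before, the git commands are nothing special; a sketch, assuming the remote and branch are already set up:

    git add .travis.yml
    git commit -m "Add travis configuration"
    git push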

What else







Configure test coverage reports with coveralls


The next service we will get to know is coveralls.

The general idea is that when you run tests, you generate a coverage report in the lcov format and send it to the coveralls service for analysis. The service processes it and provides the following features:


(image)



To connect coveralls to your project, you must perform the following steps:

  1. Log in to coveralls with your github account.
  2. Select repositories for which you want to enable information collection.
  3. Install the coveralls npm package in your repository.
  4. Set up generation of coverage reports when running the tests.
    We will not go into detail about how to generate coverage reports, because it all depends on what you write your tests with, which tools you use and how the tests are run. Besides, for typescript there are no tools that generate coverage reports directly, since it is javascript, not typescript, that actually gets executed. And if you want to see coverage of the typescript code itself, you will need tools to remap the js coverage reports back onto the typescript sources. All of this is setup-specific detail that we try to avoid in this article. You can start exploring the possible options from the coveralls npm package page on github (a minimal karma example is also sketched just after this list).


  5. Add npm-script to transfer the generated report to the coveralls service.
     "scripts": { "coveralls": "cat ./coverage/lcov.info | ./node_modules/.bin/coveralls", ... } 

  6. Run the added script after the build has run in travis. To do this, use the after_success section in the ".travis.yml" file. Now our ".travis.yml" looks like this:
     language: node_js
     node_js:
       - "6"
     script:
       - npm run ci
     after_success:
       - npm run coveralls
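As promised in step 4, here is a minimal sketch of how a karma configuration could be made to produce the lcov report that the "coveralls" script above expects. The framework, browser and file globs are assumptions for illustration (each requires the corresponding karma plugin) and do not reproduce the actual e2e4 configuration:

    // karma.conf.js (fragment)
    module.exports = function (config) {
      config.set({
        frameworks: ['jasmine'],
        files: ['src/**/*.js'],
        preprocessors: {
          'src/**/*.js': ['coverage']     // instrument the code with karma-coverage
        },
        reporters: ['progress', 'coverage'],
        coverageReporter: {
          type: 'lcov',                   // produces lcov.info plus an html report
          dir: 'coverage/',
          subdir: '.'                     // write lcov.info directly into coverage/
        },
        browsers: ['PhantomJS'],
        singleRun: true
      });
    };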



What else


In addition to github, coveralls integrates with bitbucket and promises gitlab support soon. It also integrates with many continuous integration services and development platforms.


Monitor dependency status with david


The david dependency monitoring service will be the simplest one discussed in the article.

The idea of the service is simple: we add a badge to the project page that indicates the status of the project's dependencies and also links to the dependency analysis page.

To connect david to our project, open an address of the following form in the browser:

    https://david-dm.org/<github user name>/<repository name>


Depending on what types of dependencies are in your project, on the page that opens you will see the tabs “dependencies”, “devdependencies”, “peer dependencies” and “optional dependencies”.

Each tab contains the list of dependencies of that type and a badge; clicking the badge lets you copy a link to it in markdown or HTML format and place it in your readme file or on the project page in github pages.
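For reference, such a badge link usually looks roughly like the markdown below; the exact URL is best copied from the badge popup on the david page itself, and the placeholders of course stand for your own user and repository names:

    [![dependencies status](https://david-dm.org/<user>/<repository>.svg)](https://david-dm.org/<user>/<repository>)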

What else


There is also a david cli utility designed to help update dependencies on your machine. To be honest, we have not been able to figure out what its advantages are compared with the built-in npm outdated and npm update commands.
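For comparison, the built-in commands we mean are simply these, and they need no extra installation:

    npm outdated   # list installed packages for which newer versions exist
    npm update     # update packages within the ranges allowed by package.json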


Configuring automatic updating of dependencies with greenkeeper


The greenkeeper service is also designed to help us with the difficult task of updating dependencies, but it does so at a much higher level. Its goal is to free us from manual dependency maintenance entirely.

To connect greenkeeper to our project, go to the greenkeeper page in the public integrations section on github and install the application for the repository we need.

Within a minute or so, greenkeeper will create a pull request in which it updates all of your project's dependencies and adds a badge to the readme that displays the greenkeeper status.
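In markdown, the badge it adds typically looks something like this (a sketch; the exact URL comes from the pull request itself, and the placeholders stand for your user and repository names):

    [![Greenkeeper badge](https://badges.greenkeeper.io/<user>/<repository>.svg)](https://greenkeeper.io/)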

Greenkeeper starts monitoring your project only after you merge this pull request.

From then on, whenever new versions of dependencies appear, greenkeeper will create pull requests with a detailed description of what has been updated and, where possible, a list of changes in the new version. All you have to do is merge them.

(image: an example of a greenkeeper pull request)

Naturally, connecting greenkeeper only makes sense if you have an automatic build configured and tests with which you can at least nominally check that your project still works after an upgrade. Greenkeeper detects whether your repository has builds and tests and adds a warning to the pull request text if it does not find one or the other.

What else





Improving commit messages with commitizen


Commitizen is a whole set of tools that help you write meaningful commit messages.

Besides being more informative for people interested in your repository, messages generated with commitizen have another plus: from them you can easily build a changelog and release notes. There are even utilities that analyze the commit history and suggest what the next version number should be to comply with the semver convention. But more about that in the next section.

Let's begin with the simplest part: installing cz-cli. After installation, we can use the "git cz" command instead of "git commit". This command launches a wizard that takes us through a series of questions about what we have changed in the current commit and generates a commit message in a format that is easy for humans to read and for various tools to parse.

So:
  1. Set commitizen globally:
     npm install commitizen -g 
  2. Make our repository commitizen-friendly. To do this, run the following command in the repository folder:
     commitizen init <adapter name> --save-dev --save-exact


Instead of <adapter name>, you must specify one of the available adapters. In the front-end world, the most popular one is cz-conventional-changelog, which follows the commit message convention developed by the angular team.
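To give a feel for that convention, messages produced with this adapter follow roughly the pattern below; a concrete (hypothetical) example of a first line would be "fix(scroll): do not emit the scroll event twice":

    <type>(<scope>): <short summary>

    <optional body>

    <optional footer, e.g. BREAKING CHANGE notes or issue references>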

This command installs the necessary dependencies, saves them in the devDependencies section of the package.json file and writes the required settings there.
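If you are curious what those settings look like, the adapter is registered in package.json roughly as follows (a sketch of what commitizen init typically writes; the exact content may differ between versions):

    "config": {
      "commitizen": {
        "path": "./node_modules/cz-conventional-changelog"
      }
    }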

What else





Generating changelog and release notes with conventional-changelog


After we connect commitizen, we get another possibility: automatic generation of a changelog and release notes.

Conventional-changelog will help us with this.
It is a whole family of tools: you can set up a release using the high-level tools and ready-made templates, or assemble exactly what you need from the individual low-level ones.

The developers themselves recommend using standard-version, which is a relatively high-level tool.

An even higher-level tool is semantic-release, which even pushes the changes and publishes the version to npm.

In our experience, having tried both options, we settled on using the lower-level tools.

The reason for this choice is that standard-version, for example, does not push changes and does not create a release description on github. Adding these steps to the release process, plus configuring standard-version options for running the build or generating documentation, requires a setup comparable in complexity to building the publishing process manually around the npm version command.

Semantic-release, on the other hand, tries to automate the process entirely, and using it requires a very disciplined approach to development. The complexity of its configuration is also comparable to setting up the process manually. And lastly, semantic-release runs npm publish, which restricts it to libraries distributed via npm, whereas you may want to publish versions of things other than libraries.

So, we will build the release process around npm version. When you run npm version, npm also runs the preversion, version and postversion scripts, if they are defined, and we will use all three. The release process will consist of the following steps:

The preversion phase:


  1. Clean the directories with previous build results using rimraf.
  2. Run tslint.
  3. Compile the typescript.
  4. Run the tests with karma.

Using the e2e4 repository as an example, these steps are identical to the precommit check executed in the git hook, so we reuse the same script:

 { "preversion": "npm run precommit", "precommit": "npm run rimraf -- esm coverage && npm run clean:src && npm run clean:tests && npm run lint && npm run build && npm run test" } 


The version phase:


  1. Generate documentation for publishing on github pages. In our case, typedoc is used for this. As for github pages: it is no longer necessary to create a branch named gh-pages; you can simply specify in the github settings which folder of your project should be used as the site. By default this is the "docs" folder, so we simply generate the documentation into the docs folder.
  2. Add the generated documentation to git for commit.
  3. Supplement changelog.md with change information. For this we will use conventional-changelog-cli . As parameters, we give it the file name and the convention for parsing messages. In our case it is “angular”.
  4. Add updated changelog for commit.
  5. Update the version in package.json and add it to the commit. npm does this for us.
  6. Create a tag for the version and mark it as pre-release or latest if necessary. npm does this for us too, working out which label to apply from the version number in accordance with the semver rules.


So, we get the following set of scripts:
 { "version": "npm run docs && git add -A docs && npm run changelog && git add CHANGELOG.md", "changelog": "npm run conventional-changelog -- -p angular -i CHANGELOG.md -s", "docs": "npm run rimraf -- docs && typedoc --options typedoc.json src/" } 


The postversion phase:


  1. Push the modified package.json, changelog.md and documentation.
  2. Push the created tag.
  3. Add the description of the changes to the github release. For this we use conventional-github-releaser, passing it the message convention as a parameter; in our case it is "angular". Since recording this information requires the appropriate access rights, you need to generate an access token that conventional-github-releaser will use for authorization. How to do this is described in the project documentation.


The postversion script:
 { "postversion": "git push && git push --tags && conventional-github-releaser -p angular", } 


Our release process is ready.

Run the script with the command:
    npm version <major|minor|patch>


The generated release notes look like this:

(image: the generated release notes)

The content in changelog.md is almost the same, so we will not list it.

What else




Manage tasks with zube


We will not say much about the zube.io service. Each of us has worked with task trackers, and zube is not much different from many of them.

We will name only two of its advantages, which are the reasons we decided to mention it in this article:

  1. It is free for open source projects, which is rare for project management tools that integrate with github.
  2. zube allows you to combine tasks from several github repositories into one project and work with them simultaneously.


This concludes our review. If you are aware of other interesting services or tools for working with github, please share your knowledge in the comments.

Thank you for your attention.

The cover image of the article is the Benevocats by cameronmcefee.

Source: https://habr.com/ru/post/323760/

