This article does not solve any fundamental problem, but it will help you automate, with a minimum of effort, the building of packages for your public repository.
Let's say you build ten packages for your users, each for two distributions (say, Debian unstable and Ubuntu jaunty) and each for two architectures (amd64 and i386). Remember the wonderful repository-management tool we learned about earlier, reprepro? Well, it still cannot include packages in batches, only one at a time. That means entering your GPG passphrase 10 * 2 * 2 = 40 times. And how do you efficiently build those 10 packages in the first place, especially if they are updated daily?
In fact, my situation was even worse: a single update of the qutIM repository required entering my rather long GPG passphrase 176 (one hundred seventy-six) times. I must admit that by the time I finally got fed up and overcame my laziness, I had the procedure down to about 3 minutes, roughly one second per entry. But I won't torture you; let's fix the situation right away. Install the gnupg-agent package and add the following lines to the end of your ~/.bashrc:
#
# GnuPG
#
gpg-agent --daemon --enable-ssh-support --write-env-file "${HOME}/.gpg-agent-info"
if [ -f "${HOME}/.gpg-agent-info" ]; then
    . "${HOME}/.gpg-agent-info"
    export GPG_AGENT_INFO
    export SSH_AUTH_SOCK
    export SSH_AGENT_PID
fi
Now whenever you open a local console, a gpg-agent is started that "remembers" the passphrase you enter for a while. Voilà, one big problem gone.
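How long the agent keeps the passphrase is configurable. If your builds run for hours, it may be worth raising the cache lifetime in ~/.gnupg/gpg-agent.conf; a minimal sketch, with timeouts (in seconds) that are purely illustrative:

# ~/.gnupg/gpg-agent.conf
default-cache-ttl 14400    # keep a cached passphrase for 4 hours after last use
max-cache-ttl 28800        # but never longer than 8 hours in total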
What's next? Building the packages themselves. I will explain it using my favorite qutIM as an example: some of the packages I build are updated from svn almost daily, while others were checked out once and are simply rebuilt whenever the qutIM core is updated.
To make things clearer, let me first describe how my directory layout and package naming are organized.
At the root of the build directory I initially keep only the build script and the directories of packages that are not updated from svn, such as qutim-plugin-floaties. The source trees exported from svn and the prepared source archives also land in this root. There is also a controls subdirectory holding the debian/ folders for the packages; for example, the debian/ folder for the qutim-protocol-jabber package lives in controls/qutim-protocol-jabber/debian/.
Finally, the rep/ directory contains the repository directories for the different distributions: rep/jaunty/, rep/unstable/, and so on.
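To sum up, the build directory looks roughly like this (the script name and some package names are illustrative):

build/
    build.sh                        # the build script discussed below (name is arbitrary)
    qutim-plugin-floaties/          # a package that is not updated from svn
    qutim/                          # svn exports and .orig.tar.gz archives appear here
    controls/
        qutim-protocol-jabber/
            debian/                 # the debian/ folder for qutim-protocol-jabber
    rep/
        jaunty/
        unstable/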
That is what we will build on.
So, let's start with the header of the shell script that will do the building:
#!/bin/bash

BUILDERRORS=""                                    # variable where build errors are written
MAJORVERSION="0.2a"                               # main version of each package
DISTRIBUTIONS="unstable testing stable jaunty"    # repository names
Now let's write two small, simple functions: the first builds the package from a given directory for all distributions and architectures, and the second updates all our pbuilder environments.
build_it() {
    OLDDIR=`pwd`
    cd $1
    for i in $DISTRIBUTIONS; do
        for j in i386 amd64; do
            DIST=$i ARCH=$j pdebuild -- --basetgz /var/cache/pbuilder/$i-$j-base.tgz
            if [[ $? != "0" ]]
            then
                BUILDERRORS="$BUILDERRORS
$1 - $i - $j"
            fi
        done
    done
    cd $OLDDIR
}
update_it() {
    echo "Updating pbuilder environments..."
    for i in $DISTRIBUTIONS; do
        for j in i386 amd64; do
            DIST=$i ARCH=$j pbuilder --update --basetgz /var/cache/pbuilder/$i-$j-base.tgz
            if [[ $? != "0" ]]
            then
                BUILDERRORS="$BUILDERRORS
Updating - $i - $j"
            fi
        done
    done
}
Simple, isn't it? The first function takes a directory name, changes into it and tries to build the package for every distribution and architecture. The second takes no arguments at all; it just refreshes all the pbuilder environments. Note that on failure neither function aborts the run; they just append a message to the error log.
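Both functions assume that the base tarballs (/var/cache/pbuilder/<dist>-<arch>-base.tgz) already exist and that your ~/.pbuilderrc, if you use one, honours the DIST and ARCH variables. If you are starting from scratch, the environments can be created with something along these lines (a sketch; Ubuntu distributions such as jaunty will also need an appropriate --mirror):

# one-time creation of the chroot tarballs; needs root
for i in unstable testing stable jaunty; do
    for j in i386 amd64; do
        sudo pbuilder --create --distribution $i --architecture $j \
            --basetgz /var/cache/pbuilder/$i-$j-base.tgz
    done
done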
Now we need a universal function that will build our packages. Packages differ from one another only in name, version and where their sources come from, and we will take advantage of that. First, in the debian/changelog files stored in the controls folder, I replaced the version everywhere with the placeholder "--VERSION--". Second, I wrote this wonderful little function:
make_package() {
    _REV=$1
    _URL=$2
    _NAME=$3
    _DIR=$4
    _BREV=$5
    DIRNAME=""
    PACKAGENAME=""
    if [[ $_URL != "" ]]
    then
        rm -rf "$_DIR"
        export $_REV=`LANG=C svn export $_URL $_DIR | awk '$1 == "Exported" && $2 == "revision" {print $3}' | sed 's/.$//'`
        if [[ ${!_REV} = "" ]]
        then
            echo "Error! Can't get $_NAME revision!" && exit 1
        fi
        echo "Received $_NAME revision: ${!_REV}"
    else
        export $_REV="1"
    fi
    export REVISION="${!_REV}"
    DIRNAME="${_DIR}-${_BREV}.${!_REV}"
    PACKAGENAME="${_DIR}_${_BREV}.${!_REV}"
    echo "$_NAME directory is: $DIRNAME"
    rm -rf "${PACKAGENAME}.orig.tar.gz" "$DIRNAME"
    cp -r "$_DIR" "$DIRNAME"
    tar czf "${PACKAGENAME}.orig.tar.gz" "$DIRNAME"
    echo "$_NAME source archive is: ${PACKAGENAME}.orig.tar.gz"
    cp -r "controls/${_DIR}/debian" "$DIRNAME/"
    sed -i "s#--VERSION--#${_BREV}.${!_REV}-1#" "$DIRNAME/debian/changelog"
    echo "Building $_NAME ..."
    build_it "$DIRNAME"
}
Let's take a closer look at what this function does. It takes five parameters, in order:
- the name of the variable that will receive the exported revision number;
- the URL to download the sources from;
- the package name to use in log messages;
- the Debian package name;
- the major version number.
The function is universal: if a URL is passed, the sources are exported from it and the .orig.tar.gz archive is created automatically; if not, the archive is built from the existing local directory.
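For reference, a changelog stub under controls/ might look like the following (a purely hypothetical entry; only the --VERSION-- placeholder matters). After the sed call in make_package it becomes, for example, 0.2a.294-1:

qutim (--VERSION--) unstable; urgency=low

  * Automated build from svn.

 -- Your Name <you@example.com>  Mon, 01 Jun 2009 12:00:00 +0300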
Now the package can be built with a single line:
- make_package "QREV" "qutim.org/svn/qutim" "qutIM" "qutim" " $ MAJORVERSION "
Why pass the very first parameter, QREV, at all? When building the package, we give qutIM the major version "0.2a" and the resulting package version becomes, for example, "0.2a.294-1". When I build a protocol package, I want its version to include the core revision as well:
- make_package "MRIM_REV" "qutim.org/svn/mrim" "MRIM" "qutim-protocol-mrim" " $ MAJORVERSION . $ QREV "
The result is a version like "0.2a.294.357-1". Convenient, isn't it? This way we always know exactly which source revisions went into the package.
If the package sources are already local and I just want to rebuild them without touching svn, that's fine too: simply pass an empty string as the URL:
- make_package "FLOATIES_REV" "" "Floaties" "qutim-plugin-floaties" " $ MAJORVERSION . $ QREV "
Only one thing remains: include the packages in the repository, copy it from the build machine to the server, and report which packages failed to build:
- echo "Copying packages to local repository ..."
- for j in $ DISTRIBUTIONS ; do
- for i in ` ls / var / cache / pbuilder / $ j / result / * amd64.changes`; do
- reprepro -b rep / $ j / --ignore = wrongdistribution -C main include $ j $ i
- done
- for i in ` ls / var / cache / pbuilder / $ j / result / * _i386.deb`; do
- reprepro -b rep / $ j / --ignore = wrongdistribution -C main includedeb $ j $ i
- done
- done
- echo "Updating main repository ..."
- scp -r rep / * remotehost: / path / to / repo /
- echo "Done."
- if [ " $ BUILDERRORS " ] ;
- then
- echo "There were errors in:"
- echo " $ BUILDERRORS "
- fi
Note that since we hooked gpg-agent into our console, we no longer need the --ask-passphrase option for reprepro.
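For completeness: whether and with which key reprepro signs is controlled by the SignWith field in each repository's conf/distributions. A sketch of what the stanza for rep/jaunty/ might look like (origin, label and description are illustrative):

Origin: qutIM
Label: qutIM
Codename: jaunty
Architectures: i386 amd64
Components: main
Description: qutIM packages for Ubuntu jaunty
SignWith: yes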
These scripts not only let me add a new package to the repository by simply preparing its debian/ directory and adding one line to the script, they also require no attention from me. To build, I start the script and go about my business; when I come back a few hours later, I enter the GPG passphrase once and the password for the remote repository server once. Simple and convenient.
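Since gpg-agent was started with --enable-ssh-support, it also acts as an ssh-agent, so even that last server password can be avoided if the repository host accepts key-based SSH authentication; just add your key once per session (assuming a standard key path):

ssh-add ~/.ssh/id_rsa    # the agent will cache the key for subsequent scp runs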
P.S. For those who like to experiment: besides the convenient reprepro there is also the official dak utility, which is used for the repositories of Debian itself, among others. I have never used it, since it requires PostgreSQL and is written in Python, a language I dislike (I'm not going to start a holy war about it, so don't ask). But anyone is welcome to try it.
P.P.S. For those who want to use these scripts themselves, here is the full script I use. Feel free to adapt it to your needs:
pastebin.com/f3289286a