
Once, in the dead of winter, they came to me as usual, soldering iron at the ready.
The task: they have a server at Rackspace, fully loaded. On that server a database runs on MS SQL Web Edition. And then suddenly the guys realized: where should the nightly backup go? It had begun to exceed all reasonable limits, and the benefit of keeping it on the disk of the very same instance was close to nil.
Like the wise owl from the joke, I suggested they set up cloud storage from the control panel. For that I was cruelly tormented with hot sharp objects: it turns out their server is not just old but ancient, sitting on some prehistoric plan on which nothing is available except Cloud Files. And to change the plan, one must go to the big boss of a respected company that has been 200 years on the market, and the big boss does not pay their salaries so they can come to him with such trifles.
I was terribly surprised and asked: what is so difficult about knocking a script together on one's knee and running it right after the database backup finishes?
It was again explained to me, with the help of green portraits of dead presidents, that theirs is a respected company, 200 years on the market, and scribbling scripts is no business for royalty. And since I have quite a few things tied up with them for entirely serious reasons, I had to help.
Who knows, maybe these scripts will be useful to someone else too, so I will describe them in more detail.
Preparation

The official manual tells us that curl alone is enough to work with the cloud. We will not take their word for it, though, and install the whole MSYS bundle that ships with MinGW.
Then we fill in a file with the login credentials, mylogin.xml:
<?xml version="1.0" encoding="UTF-8"?>
<credentials xmlns="http://docs.rackspacecloud.com/auth/api/v1.1"
             username="LOGIN"
             key="RS API KEY"/>
We will feed this file to Rackspace to get a token. Without that token we will not get any further.
Also, for the thrill of it, we will write everything as DOS bat files, so the enemy does not suspect a thing. Honestly, there is no point in inventing anything fancier: each of them comes out at about five lines. The skeleton of every one is the same,
@echo off
setlocal
set "PATH=%PATH%;C:\MinGW\msys\1.0\bin"
...
endlocal
so I will show it only once.
Login

The token, which lives for a whole 24 hours, is obtained like this:
curl -k -X POST -d @mylogin.xml -H "Content-Type: application/xml" -H "Accept: application/xml" https://auth.api.rackspacecloud.com/v1.1/auth > auth-token.xml
cat auth-token.xml | sed -e 's/.*token id=.//' | sed -e 's/..expires=.*//g' > token
A brutal mix of styles, but we now have our first script, login.bat, and a file named token as the result of its work. Every later script can pick the token up like this:
set /p TOKEN=<token
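The sed surgery in login.bat can be tried out locally on a fake response (the sample XML below is hypothetical; a real v1.1 auth response carries more attributes, but the token element has this shape):

```shell
# Fake auth response for the demo - not a real Rackspace reply
echo '<token id="abc123" expires="2015-01-01T00:00:00Z"/>' > sample-token.xml

# Same pipeline as in login.bat:
#  1) drop everything up to and including the quote after token id=
#  2) drop the quote-and-space before expires= and everything after
cat sample-token.xml | sed -e 's/.*token id=.//' | sed -e 's/..expires=.*//g'
# prints: abc123
```

Crude, but it pulls the bare token id out without any XML parser at all.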
Create a container for backups

Now we must decide where exactly we will put the backups, and how. The schemes can be quite elaborate, but let's take the simplest case. Imagine for a moment that we have an ordinary differential backup, and all we need is to upload it once a day to a fixed place.
set /p TOKEN=<token
curl -k -X PUT -H "X-Auth-Token: %TOKEN%" https://storage101.dfw1.clouddrive.com/v1/MossoCloudFS_d3125ab9-8601-45ba-a432-edf3728673bb/Backup
And with that we have defined the place: a container named Backup. Name yours to taste.
Upload: first attempt

Now let's try to upload something there:
set /p TOKEN=<token
curl -# -k -X PUT -T %1 -H "Content-Type: application/octet-stream" -H "X-Auth-Token: %TOKEN%" https://storage101.dfw1.clouddrive.com/v1/MossoCloudFS_d3125ab9-8601-45ba-a432-edf3728673bb/Backup/%1
And what lies there?

Let's pause for a moment and look at what is in the container at all (a GET on the container itself, no object name this time):
set /p TOKEN=<token
curl -# -k -H "X-Auth-Token: %TOKEN%" https://storage101.dfw1.clouddrive.com/v1/MossoCloudFS_d3125ab9-8601-45ba-a432-edf3728673bb/Backup
After running it we will see a list of the container's contents, something like this:
4.0.327-RC1.rar
5.0.709-RELENG-spfix.rar
5.5.907-spfix.rar
foo.bak
Logically enough, whatever we managed to upload in the previous step is there.
Download back

This one is obvious in general: just the URL plus the token.
set /p TOKEN=<token
curl -# -o %1 -k -H "X-Auth-Token: %TOKEN%" https://storage101.dfw1.clouddrive.com/v1/MossoCloudFS_d3125ab9-8601-45ba-a432-edf3728673bb/Backup/%1
The file name is passed as a parameter, and it works!
And now - the catch!

If that were the whole story, I would not be writing this post.
We roll it out to production, that is, we copy the script to the server, and...
Ba-bam! Request size exceeded. Logical enough: for the tests we uploaded small files, while the database dump is over 50 gigabytes.
Uploading large files - attempt number two

Does that mean we cannot upload the whole database dump?
Yes and no.
In one request we cannot, that is a fact. But ending up with a single 50+ gigabyte file on the server is still possible: we upload the file in parts and then combine the parts into one object.
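The split-then-reassemble idea can be demonstrated locally, cloud aside (a toy sketch: 3-byte pieces instead of 500-megabyte ones, and made-up file names):

```shell
# Make a tiny stand-in for the database dump
printf 'abcdefgh' > dump.bak

# Cut it into 3-byte pieces, same flags as the real script
# (GNU split: -d gives numeric suffixes .00, .01, .02)
split -d --bytes=3 dump.bak dump.bak.parts.

# Gluing the pieces back in name order restores the original byte-for-byte
cat dump.bak.parts.* > restored.bak
cmp -s dump.bak restored.bak && echo "parts add up"
# prints: parts add up
```

In the cloud the gluing is done server-side rather than with cat, but the principle is exactly the same.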
For this, here is the whole fragment at once; then I will take it apart in more detail.
set /p TOKEN=<token
split -d --bytes=500m %1 %1.parts.
ls %1.parts.* | awk 'BEGIN{x=0; print "@echo off"; }{ ++x; print "echo Uploading part " x "...\ncurl -# -k -X PUT -T " $1 " -H \"Content-Type: application/octet-stream\" -H \"X-Auth-Token: TOKEN\" https://storage101.dfw1.clouddrive.com/v1/MossoCloudFS_d3125ab9-8601-45ba-a432-edf3728673bb/Backup/FNN/" x " --data-binary \'" x "\'" }' | sed -es/FNN/%1/g | sed -es/TOKEN/%TOKEN%/g > _1.bat
echo curl -# -k -X PUT -H "X-Auth-Token: TOKEN" -H "X-Object-Manifest: Backup/%1/" https://storage101.dfw1.clouddrive.com/v1/MossoCloudFS_d3125ab9-8601-45ba-a432-edf3728673bb/Backup/%1 --data-binary '' | sed -es/TOKEN/%TOKEN%/g >> _1.bat
call _1.bat
echo Clearing...
rm -f _1.bat %1.parts.*
So, first we need to cut a large file into pieces:
split -d --bytes=500m %1 %1.parts.
After that comes the truly grim part: from our script we will generate a second script, which will do all the actual work.

ls %1.parts.* | awk 'BEGIN{x=0; print "@echo off"; }{ ++x; print "echo Uploading part " x "...\ncurl -# -k -X PUT -T " $1 " -H \"Content-Type: application/octet-stream\" -H \"X-Auth-Token: TOKEN\" https://storage101.dfw1.clouddrive.com/v1/MossoCloudFS_d3125ab9-8601-45ba-a432-edf3728673bb/Backup/FNN/" x " --data-binary \'" x "\'" }' | sed -es/FNN/%1/g | sed -es/TOKEN/%TOKEN%/g > _1.bat
The simple trick of feeding ls %1.parts.* into awk gives us one command per piece. Like this:
curl -# -k -X PUT -T foo.parts.00 -H "Content-Type: application/octet-stream" -H "X-Auth-Token: %TOKEN%" https://storage101.dfw1.clouddrive.com/v1/MossoCloudFS_d3125ab9-8601-45ba-a432-edf3728673bb/Backup/foo/1 --data-binary '1'
curl -# -k -X PUT -T foo.parts.01 -H "Content-Type: application/octet-stream" -H "X-Auth-Token: %TOKEN%" https://storage101.dfw1.clouddrive.com/v1/MossoCloudFS_d3125ab9-8601-45ba-a432-edf3728673bb/Backup/foo/2 --data-binary '2'
curl -# -k -X PUT -T foo.parts.02 -H "Content-Type: application/octet-stream" -H "X-Auth-Token: %TOKEN%" https://storage101.dfw1.clouddrive.com/v1/MossoCloudFS_d3125ab9-8601-45ba-a432-edf3728673bb/Backup/foo/3 --data-binary '3'
...
where foo is the file name passed to us on the command line.
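The ls-into-awk generator is easier to see on a toy scale (a local sketch with made-up file names; no cloud, no curl):

```shell
# For every file name on stdin, awk bumps a counter and prints one
# command line per name - the same ++x bookkeeping as the real script
printf 'foo.parts.00\nfoo.parts.01\n' |
awk 'BEGIN{x=0}{ ++x; print "upload " $1 " as segment " x }'
# prints:
#   upload foo.parts.00 as segment 1
#   upload foo.parts.01 as segment 2
```

Swap the toy print for the full curl command line and redirect the output to a file, and you have _1.bat.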
The final touch is to glue it all together:
echo curl -# -k -X PUT -H "X-Auth-Token: TOKEN" -H "X-Object-Manifest: Backup/%1/" https://storage101.dfw1.clouddrive.com/v1/MossoCloudFS_d3125ab9-8601-45ba-a432-edf3728673bb/Backup/%1 --data-binary '' | sed -es/TOKEN/%TOKEN%/g >> _1.bat
which leaves this final line in the freshly baked batch file:
curl -# -k -X PUT -H "X-Auth-Token: %TOKEN%" -H "X-Object-Manifest: Backup/foo/" https://storage101.dfw1.clouddrive.com/v1/MossoCloudFS_d3125ab9-8601-45ba-a432-edf3728673bb/Backup/foo --data-binary ''
This zero-byte object is the glue: its X-Object-Manifest header tells Cloud Files that a GET of Backup/foo should stream back, in order, every object whose name starts with Backup/foo/.
Profit! We really did learn to upload large files to the cloud and to get them back.
I hope this stream of quick-and-dirty code will be useful to someone.