
Recipe for Capistrano: importing the production database onto the developer machine

Sometimes you need to pull the production database onto your own machine for development and testing.
I wrote this recipe as part of a gem; for now it only works when production and development use the same database adapter (in our case, PostgreSQL). If you wish, you can add support for other databases (a sketch of a MySQL branch is shown after the recipe below).

Capistrano is a perfect fit for this task (as it is for many others).

To use the recipe, you need to install 7zip on both the server and the local machine:
sudo apt-get --assume-yes install p7zip-full
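
If you'd rather let Capistrano take care of the server side, here is a minimal sketch of a setup task (my addition, not part of the original recipe; it assumes the deploy user can sudo and simply reuses the apt-get command above):

namespace :setup do
  # install the 7zip command-line tools on the server
  task :install_7zip do
    run "#{sudo} apt-get --assume-yes install p7zip-full"
  end
end

On the local machine you still run the apt-get command by hand (or, on macOS, something like brew install p7zip).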


Add this code to Capfile:
...
mac_os = `uname -a`.include?("Darwin")
run_local_sudo_psql = mac_os ? "psql -h localhost -U postgres" : "#{sudo} -u postgres psql"
run_local_psql      = mac_os ? "psql -h localhost" : "psql"

database_yml_path = "config/database.yml"

# production connection settings, read from the deployed database.yml
config      = YAML::load(capture("cat #{current_path}/#{database_yml_path}"))
adapter     = config[rails_env]["adapter"]
database    = config[rails_env]["database"]
db_username = config[rails_env]["username"]
db_password = config[rails_env]["password"]

# local (development) connection settings
config            = YAML::load(File.open(database_yml_path))
local_rails_env   = 'development'
local_adapter     = config[local_rails_env]["adapter"]
local_database    = config[local_rails_env]["database"]
local_db_username = config[local_rails_env]["username"]
local_db_password = config[local_rails_env]["password"]

set :local_folder_path, "tmp/backup"               # where the *.sql.7z files are stored on the dev machine
set :timestamp, Time.now.to_i                      # timestamp in seconds
set :db_file_name, "#{database}-#{timestamp}.sql"  # e.g. app_production-1332404980.sql
set :db_archive_ext, "7z"

namespace :db do
  task :backup do
    file_name      = fetch(:db_file_name)
    archive_ext    = fetch(:db_archive_ext)
    dump_file_path = "#{shared_path}/backup/#{file_name}"
    output_file    = "#{dump_file_path}.#{archive_ext}"

    run "mkdir -p #{shared_path}/backup"
    if adapter == "postgresql"
      run %{export PGPASSWORD="#{db_password}" && pg_dump -U #{db_username} #{database} > #{dump_file_path}}
      run "cd #{shared_path} && 7z a #{output_file} #{dump_file_path} && rm #{dump_file_path}"
    else
      puts "Cannot backup, adapter #{adapter} is not implemented for backup yet"
    end

    system "mkdir -p #{local_folder_path}"
    download(output_file, "#{local_folder_path}/#{file_name}.#{archive_ext}")
  end

  task :import do
    file_name = "#{db_file_name}.#{db_archive_ext}"
    file_path = "#{local_folder_path}/#{file_name}"

    if local_adapter == "postgresql"
      system "cd #{local_folder_path} && 7z x #{file_name}"
      system %{echo "drop database #{local_database}" | #{run_local_sudo_psql}}
      system %{echo "create database #{local_database} owner #{local_db_username};" | #{run_local_sudo_psql}}
      puts "ENTER your development password: #{local_db_password}"
      system "#{run_local_psql} -U#{local_db_username} #{local_database} < #{local_folder_path}/#{db_file_name}"
      system "rm #{local_folder_path}/#{db_file_name}"
    else
      puts "Cannot import, adapter #{local_adapter} is not implemented for import yet"
    end
  end

  task :pg_import do
    if adapter == local_adapter
      db.backup
      db.import
    else
      puts "Sorry, but the development and production adapters must be equal"
    end
  end
end

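The recipe above only handles PostgreSQL. As a rough illustration of how another adapter could be added (the MySQL branch mentioned earlier), here is a hedged sketch for the :backup task; it is my addition, reuses the db_username / db_password / database variables from the recipe, and assumes mysqldump is available on the server:

    if adapter == "postgresql"
      run %{export PGPASSWORD="#{db_password}" && pg_dump -U #{db_username} #{database} > #{dump_file_path}}
    elsif adapter == "mysql2"
      # hypothetical MySQL branch: dump with mysqldump instead of pg_dump
      run %{mysqldump -u #{db_username} --password="#{db_password}" #{database} > #{dump_file_path}}
    else
      puts "Cannot backup, adapter #{adapter} is not implemented for backup yet"
    end

A matching branch in the :import task (recreating the local database and loading the dump with the mysql client) would be needed as well.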

How it works:


cap db:pg_import

  1. creates the folder "#{project_name}/shared/backup" on the server
  2. dumps the production database into that folder and archives it with 7zip
  3. sends the archive over ssh to the project's tmp/backup folder on the local machine
  4. extracts the dump and imports it into the development database. IMPORTANT: before doing this it DROPs the current local database
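
One more thing to keep in mind: every run leaves a timestamped *.sql.7z archive both in shared/backup on the server and in tmp/backup on the local machine. A small housekeeping task to prune them could look like this (my addition, not part of the original recipe):

namespace :db do
  task :cleanup_backups do
    # remove archived dumps left in shared/backup on the server
    run "rm -f #{shared_path}/backup/*.7z"
    # remove archives downloaded to the dev machine
    system "rm -f #{local_folder_path}/*.7z"
  end
end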

Source: https://habr.com/ru/post/140492/

