Never been to TextSnippets before?

Snippets is a public source code repository. Easily build up your personal collection of code snippets, categorize them with tags / keywords, and share them with the world (or not, you can keep them private!)

« Newer Snippets
Older Snippets »
12 total  XML / RSS feed 

PostgreSQL dump Strongspace archive script

Dump, bzip2, and upload a PostgreSQL database to Strongspace. Via Ubernostrum.


FILENAME=b_list-`/usr/xpg4/bin/date +%Y%m%d`.sql.bz2
cd /home/myuser/dumps
/opt/local/bin/pg_dump -U db_username db_name | /usr/bin/bzip2 > $FILENAME
/usr/bin/scp $FILENAME me@strongspace:/my/backup/dir/
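To make this a nightly job, the snippet can be saved as a script and run from cron; a sketch (the script path and schedule are assumptions):

```
# crontab -e entry: run the dump/upload script every night at 02:30
30 2 * * * /home/myuser/bin/pg_strongspace_backup.sh >> /home/myuser/dumps/backup.log 2>&1
```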

backup files

// create backup copy of a file, named <file>.<date>.<revision>


date=`date +%Y%m%d`

usage () {
        echo "Usage: `basename $0` filename"
}

filename=$1

if [ -z "$filename" -o ! -f "$filename" ]; then
        usage
        exit 1
fi

rev=0
backup=$filename.$date.$rev

while [ -f $backup ]; do
        let rev+=1
        backup=$filename.$date.$rev
done

cp $filename $backup
exit $?
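The core of the snippet is the revision loop that finds a free backup name; a standalone sketch of the same idea (filenames invented for illustration):

```shell
# find the first free revision number for a dated backup name
filename=notes.txt
date=20240101
touch "$filename" "$filename.$date.0"   # pretend one backup already exists

rev=0
backup=$filename.$date.$rev
while [ -f "$backup" ]; do
        rev=`expr $rev + 1`
        backup=$filename.$date.$rev
done
cp "$filename" "$backup"
echo "$backup"   # notes.txt.20240101.1
```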

mysql backup / restore

Exports the given database to a text file, which can then be zipped and downloaded or sent somewhere (mailing pending).
mysqldump -u <username> -p -q --single-transaction <db_name> > <backup_filename>

To restore the database (be sure to create the target DB before running this)
mysql -u <username> -p <db_name> < <backup_filename>

To back up and restore data only, use -t, which omits the table-creation statements:
mysqldump -u <username> -p -q --single-transaction -t <db_name> > <backup_filename>

Then restore from command line as usual.
mysql -u <username> -p <db_name> < <backup_filename>
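Since the dump is plain SQL, the "zipped" step mentioned above is just a pipe through gzip; a sketch using the same placeholders:

```
# dump straight through gzip, without a plain-text intermediate file
mysqldump -u <username> -p -q --single-transaction <db_name> | gzip > <backup_filename>.gz
# and restore by streaming the compressed file back in
zcat <backup_filename>.gz | mysql -u <username> -p <db_name>
```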

Tar up, excluding common media extensions

Just the code, please! Hop into the dir you want to back up (note: this means the archive will expand without a base directory, so do a mkdir/cd first when extracting!)

find . ! -type d -print | egrep '/,|%$|~$|\.old$|\.mpg$|\.zip$|\.wmv$|\.mp3$|\.MP3$|\.mp4$|\.MP4$|\.av2$|\.ppt$|\.dir$|\.pps$|\.qt$|SCCS|/core$|\.o$|\.orig$|\.mpeg$|\.mov$|\.doc$|\.xls$|\.pdf$|\.swf$|\.fla$|\.wav$|\.aif$|\.aiff$|\.mp3$|\.jpg$|\.JPG$|\.jpeg$|\.JPEG$|\.gif$|\.GIF$|\.png$|\.PNG$|\.psd$|\.PSD$|\.tar.gz$|\.tgz$|\.TGZ$|\.tif$|\.TIF$|\.tiff$|\.TIFF$|\.tga$|\.TGA$|\.ram$|\.rm$|\.rma$|\.psd$|\.PSD$|\.ai$|\.AI$' > Exclude
tar czfvX ~/backup_text_back.tgz Exclude .
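The mkdir/cd warning above can be seen directly; a small demonstration (directory names invented):

```shell
# the archive is built from inside the dir, so its members carry no base directory
mkdir -p demo_src restored
echo hi > demo_src/a.txt
( cd demo_src && tar czf ../text_back.tgz . )
# extraction therefore lands in whatever directory you run tar from
( cd restored && tar xzf ../text_back.tgz )
ls restored/a.txt
```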

Web and database backup

mysqldump -u root -p --all-databases | gzip > /path/to/backups/YYYY-MM-DD-alldbs.sql.gz
scp /path/to/backups/YYYY-MM-DD-alldbs.sql.gz [email protected]:/path/to/backups
tar czvf /path/to/backups/YYYY-MM-DD-production.tgz /path/to/production
scp /path/to/backups/YYYY-MM-DD-production.tgz [email protected]:/path/to/backups



Mount Strongspace to a folder in Ubuntu

1) Install the software
sudo apt-get install sshfs

2) Add fuse to /etc/modules (open the file and append a line reading "fuse")
sudo nano /etc/modules

3) Add yourself to the 'fuse' group, then log out and log in again.
sudo adduser your-username fuse

4) Create a mountpoint and give yourself ownership
sudo mkdir /media/mount-name
sudo chown your-username /media/mount-name

5) Mount the filesystem
sshfs remote-system-name:/remote-folder /media/mount-name

6) Unmount the filesystem
fusermount -u /media/mount-name
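The mount in step 5 also accepts the usual sshfs options; a sketch with two that are often useful here (the remote name is the same placeholder as above):

```
# reconnect on dropped connections; allow_other lets other local users read the
# mount (requires user_allow_other in /etc/fuse.conf)
sshfs -o reconnect,allow_other remote-system-name:/remote-folder /media/mount-name
```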

Directions lifted from Ubuntu forums and here also.

For myself, I had better results running the following command from inside the directory that contains the folder I mounted...

sudo sshfs remote-system-name:/remote-folder folder_name

Where folder_name is the name of the folder that you are mounting Strongspace to.

Backup from textdrive to strongspace

# backup script for textdrive to strongspace based on these articles:

# change this!
PWD_MYSQL=somepassword # chmod 700 this file

# Backup database 
# copy this line for more databases or 
# use --all-databases if you are the main user
# Don't forget to change the database name
/usr/local/bin/mysqldump --opt --skip-add-locks --user=$USER --password=$PWD_MYSQL database1 | gzip > $HOME/backups/database1_`date "+%Y-%m-%d"`.gz
/usr/local/bin/mysqldump --opt --skip-add-locks --user=$USER --password=$PWD_MYSQL database2 | gzip > $HOME/backups/database2_`date "+%Y-%m-%d"`.gz

# Backup subversion (Only works with FSFS)
cd $HOME
tar -z -c -f $HOME/backups/svn_`date "+%Y-%m-%d"`.tar.gz svn

# Add custom dirs here, if you need it, just like the svn example above
# I just keep everything I need in subversion

# Delete old backups
cd $HOME/backups/
/usr/bin/find *.gz -mtime +8 -delete

# Send it to strongspace
/usr/local/bin/rsync -azq --delete -e "ssh -i $HOME/.ssh/ss" $HOME/backups/*.gz $USER_SS@strongspace:txd-backup/
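The rsync line signs in with the key at $HOME/.ssh/ss; a sketch of the one-time setup, assuming a dedicated passwordless key (names follow the script):

```
# generate a passwordless key used only by this backup job
ssh-keygen -t rsa -f ~/.ssh/ss -N ""
# append ~/.ssh/ to the authorized_keys on Strongspace, then
# schedule the script, e.g. via crontab -e:
#   15 3 * * * /home/myuser/backup-to-ss.sh
```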

Backup with RAR + winrar.exe
In the "Parameters" field I have the following string: "u -ibck -m0 -ag[dd-mm] -r -y "I:_backuparchives1Canddocs.rar" @I:_backuparchivesbackup.lst". Taking it in order: "u" means "add to the archive, updating older files". "-ibck" runs WinRAR as a background process, so it doesn't keep flashing its windows at us. The "-m0" parameter sets the compression method: the 0 here means no compression; instead of zero you can use the digits 1 through 5 for increasing compression levels. I leave it at zero because the backup then takes considerably less time, the system doesn't bog down, and restoring the archive later is much easier. Next comes "-ag[dd-mm]", which appends the current date to the archive name. The "-r" parameter is needed so that subfolders are added to the archive as well, i.e. recursive folder traversal. "-y" means automatically answer "yes" to every prompt during archiving. Then, in quotes, comes the path to the backup archive file, followed by the "@" symbol and the path to the list of files to back up. That list can be written in plain Notepad and should contain the paths to all the files and folders, one path per line.
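A hedged example of such a list file (paths invented for illustration), one path per line:

```
C:\Documents and Settings\user\My Documents
D:\projects
C:\1C\bases
```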

Daily rotated backups to Strongspace


# backup folder (the original value was lost; $HOME/backups is an assumption)
DEST=$HOME/backups
MYSQL_DATABASES="db1 db2 dbn"
DIRS="dir1 dir2 dirn"
STRONGSPACE_USER=[email protected]

# 0 on even days of the month, 1 on odd days: two rotating backup sets
BACKUP_ALT=$(python -c "import time, sys; sys.stdout.write(str(int(time.strftime('%d')) % 2))")

for d in $MYSQL_DATABASES; do
mysqldump --quick --opt --skip-add-locks $d | gzip - > ${DEST}/${d}.sql.gz
done

for d in $DIRS; do
tar zcvf ${DEST}/${d}.tar.gz ${HOME}/${d} >/dev/null
done

scp -r ${DEST}/* ${STRONGSPACE_USER}:backup/$BACKUP_ALT/

rm -rf $DEST/*
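The Python one-liner only computes the day of the month mod 2; a plain-shell equivalent (the sed strips the leading zero so nothing gets read as octal):

```shell
# 0 on even days of the month, 1 on odd days
day=`date +%d | sed 's/^0//'`
BACKUP_ALT=`expr $day % 2`
echo $BACKUP_ALT
```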

Upload your Flickr photos to Strongspace

require 'net/http'
require 'rubygems'
require_gem 'flickr'
require_gem 'net-sftp'

flickr_username = "[email protected]"
flickr_pass = 'x'
strongie_pass = 'x'
strongie_username = 'johan'
strongie_upload_dir = "flickr_test"

flickr = Flickr.new
flickr.login(flickr_username, flickr_pass)
user = flickr.users(flickr_username)

# NOTE: the Flickr and Strongspace hostnames were lost from the original
# snippet; the values below are plausible reconstructions, adjust as needed
Net::SFTP.start("#{strongie_username}.strongspace.com", strongie_username, strongie_pass) do |sftp|
  Net::HTTP.start('static.flickr.com') do |http|
    user.photos.each do |photo|
      src_url = photo.source('Large').sub("http://static.flickr.com", '')
      puts "Fetching \"#{photo.title}\"..."
      res = http.get(src_url)
      filename = File.basename(src_url)
      sftp.open_handle("/home/#{strongie_username}/#{strongie_upload_dir}/#{filename}", 'w') do |handle|
        result = sftp.write(handle, res.body)
        puts "Wrote #{filename} with result code: #{result.code}..."
      end
    end
  end
end

Make a backup of all tables starting with...

This will backup all tables starting with "prefix_" to a gzipped file backup.sql.gz.

echo "SHOW TABLES" | mysql -uUSERNAME -pPASSWORD -D DBNAME | grep ^prefix_ | xargs mysqldump -uUSERNAME -pPASSWORD DBNAME | gzip -c > backup.sql.gz

You have to replace USERNAME, PASSWORD and DBNAME twice.

Note: On TextDrive you want to add --skip-opt to mysqldump, otherwise the command will abort with an error because the privileges for locking tables are missing (for me at least).

To check which tables were backed up, you can use:
zcat backup.sql.gz | grep CREATE

Read the mysqldump documentation to find out which other options to add.
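Restoring that gzipped dump is the reverse of the pipeline (same placeholders, and the target DB must already exist):

```
zcat backup.sql.gz | mysql -uUSERNAME -pPASSWORD DBNAME
```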

Backing up subversion repositories from a remote machine

# Author: Jacques Marneweck 
# License: PHP License v3.0

LOCALVER=`/usr/local/bin/svnlook youngest /home/svn/livejournal`
REMOTEVER=`/usr/bin/ssh jacques@hostname /usr/local/bin/svnlook youngest /home/svn/livejournal`
echo "Local version is ${LOCALVER}"
echo "Remote version is ${REMOTEVER}"

if [ "$REMOTEVER" -gt "$LOCALVER" ]; then
  echo "Remote version is greater than local version"
  START=$(echo "${LOCALVER} + 1" | /usr/bin/bc -l)
  /usr/bin/ssh jacques@hostname /usr/local/bin/svnadmin dump --incremental --deltas --revision ${START}:${REMOTEVER} /path/to/repo | /usr/local/bin/svnadmin load --ignore-uuid /path/to/repo
else
  echo "Both local and remote version have the same data"
fi
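The script assumes the local mirror already exists; a hedged sketch of the one-time seeding, reusing the script's placeholder paths:

```
# one-time setup: create the local repository, then load a full dump from the remote
/usr/local/bin/svnadmin create /path/to/repo
/usr/bin/ssh jacques@hostname /usr/local/bin/svnadmin dump /path/to/repo | /usr/local/bin/svnadmin load --ignore-uuid /path/to/repo
```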