itramblings

Ramblings from an IT manager and long-time developer.


Pretty good backup script for linux folders

This was originally taken from here with some modifications by me

Automating backups with tar

It is always worthwhile to automate backup tasks; automation lets your Linux server do this work for you on a schedule. The following example is our backup script, called backup.cron. This script is designed to run on any computer by changing only these five variables:

  1. COMPUTER
  2. DIRECTORIES
  3. BACKUPDIR
  4. TIMEDIR
  5. BACKUPSET

We suggest that you set this script up and run it for the first time at the beginning of the month, and then let it run for a month before making major changes. In the example below we back up to a directory on the local server (BACKUPDIR), but you could modify the script to write to a tape drive on the local server or to an NFS-mounted file system.

  1. Create the backup script file with touch /etc/cron.daily/backup.cron and add the following lines to it:
    #!/bin/sh
    # full and incremental backup script
    # created 07 February 2000
    # Based on a script by Daniel O'Callaghan <danny@freebsd.org>
    # and modified by Gerhard Mourani <gmourani@videotron.ca>
    # and modified by Shawn Anderson <sanderson@eye-catcher.com> on 2016-08-14
    #Change the 5 variables below to fit your computer/backup
    
    COMPUTER=$(hostname) # name of this computer
    BACKUPSET=HOMEDIR # name of the backup set
    DIRECTORIES="/home" # directories to backup
    BACKUPDIR=/backups # where to store the backups
    TIMEDIR=/backups/last-full # where to store time of full backup
    TAR=/bin/tar # name and location of tar
    
    #You should not have to change anything below here
    PATH=/usr/local/bin:/usr/bin:/bin
    DOW=`date +%a` # Day of the week e.g. Mon
    DOM=`date +%d` # Date of the Month e.g. 27
    DM=`date +%d%b` # Date and Month e.g. 27Sep
    
    #Set various things up
    
    # Is pv installed? (shows a progress bar; install it non-interactively if missing)
    type pv >/dev/null 2>&1 || sudo apt-get install -y pv
    
    # Do the required paths exist
    if [ ! -d $BACKUPDIR ]; then
       mkdir $BACKUPDIR
    fi
    
    if [ ! -d $TIMEDIR ]; then
       mkdir $TIMEDIR
    fi
    
    # On the 1st of the month a permanent full backup is made
    # Every Sunday a full backup is made - overwriting last Sunday's backup
    # The rest of the time an incremental backup is made. Each incremental
    # backup overwrites last week's incremental backup of the same name.
    #
    # If NEWER = "", then tar backs up all files in the directories;
    # otherwise it backs up files newer than the NEWER date. NEWER
    # gets its date from the file written every Sunday.
    
    # Monthly full backup
    if [ $DOM = "01" ]; then
       NEWER=""
       $TAR $NEWER -cf - $DIRECTORIES | pv -s $(du -sb $DIRECTORIES | awk '{print $1}') | gzip > $BACKUPDIR/$BACKUPSET-$COMPUTER-$DM.tgz
    fi
    
    # Weekly full backup
    if [ $DOW = "Sun" ]; then
       NEWER=""
       NOW=`date +%d-%b`
    
       # Update full backup date
       echo $NOW > $TIMEDIR/$COMPUTER-full-date
       $TAR $NEWER -cf - $DIRECTORIES | pv -s $(du -sb $DIRECTORIES | awk '{print $1}') | gzip > $BACKUPDIR/$BACKUPSET-$COMPUTER-$DOW.tgz
    
    # Make incremental backup - overwrite last weeks
    else
       # Get date of last full backup
       NEWER="--newer `cat $TIMEDIR/$COMPUTER-full-date`"
       $TAR $NEWER -cf - $DIRECTORIES | pv -s $(du -sb $DIRECTORIES | awk '{print $1}') | gzip > $BACKUPDIR/$BACKUPSET-$COMPUTER-$DOW.tgz
    fi
    
    # Remove backup files older than 90 days (this really shouldn't be necessary unless something
    # isn't right with the auto-rotation; I have it in just for good measure)
    find $BACKUPDIR/$BACKUPSET-$COMPUTER* -mtime +90 -exec rm {} \;
    Here is an abbreviated listing of the backup directory after one week (this listing is from the original script, which wrote uncompressed .tar files; the modified script above writes gzip-compressed .tgz files):

    total 22217
    -rw-r--r-- 1 root root 10731288 Feb 7 11:24 deep-HOMEDIR-01Feb.tar
    -rw-r--r-- 1 root root 6879 Feb 7 11:24 deep-HOMEDIR-Fri.tar
    -rw-r--r-- 1 root root 2831 Feb 7 11:24 deep-HOMEDIR-Mon.tar
    -rw-r--r-- 1 root root 7924 Feb 7 11:25 deep-HOMEDIR-Sat.tar
    -rw-r--r-- 1 root root 11923013 Feb 7 11:24 deep-HOMEDIR-Sun.tar
    -rw-r--r-- 1 root root 5643 Feb 7 11:25 deep-HOMEDIR-Thu.tar
    -rw-r--r-- 1 root root 3152 Feb 7 11:25 deep-HOMEDIR-Tue.tar
    -rw-r--r-- 1 root root 4567 Feb 7 11:25 deep-HOMEDIR-Wed.tar
    drwxr-xr-x 2 root root 1024 Feb 7 11:20 last-full
    

    Important: The directory where the backups are stored (BACKUPDIR) and the directory where the time of the full backup is stored (TIMEDIR) must exist, or be created, before the backup script runs, or you will receive an error message.

  2. If you do not start running this backup script on the first day of the month (01-month-year), the incremental backups need the date of the last full (Sunday) backup in order to work properly. If you start in the middle of the week, create the time file in the TIMEDIR directory with the following command:
    [root@deep] /# date +%d-%b > /backups/last-full/myserver-full-date

    Here /backups/last-full is our TIMEDIR variable, where we store the time of the full backup, myserver is the name of our server (e.g. deep), and the time file consists of a single line containing the current date (e.g. 15-Feb).

  3. Make this script executable and set its permissions to 755 so that it is writable only by the super-user root:
    [root@deep] /# chmod 755 /etc/cron.daily/backup.cron

Because this script is in the /etc/cron.daily directory, it will be run automatically as a cron job every day, typically in the early morning; the exact time depends on your distribution's /etc/crontab (or anacron) configuration.
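
For reference, on most systems the scripts in /etc/cron.daily are invoked by a run-parts entry in /etc/crontab (or by anacron). The excerpt below is a typical Red Hat-style entry; the schedule on your machine may differ:

    # excerpt from /etc/crontab - run-parts runs every executable script in the directory
    02 4 * * * root run-parts /etc/cron.daily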


Export SharePoint Terms Group to XML

Here is a PowerShell script to export SharePoint Term Groups to XML

 

param(
	[string]$siteUrl = "http://sharepoint.local:2013",
	[string]$termGroup = "Sample Term Group",
	[string]$exportPath = $null
)


function Add-Snapin {
	if ((Get-PSSnapin -Name Microsoft.Sharepoint.Powershell -ErrorAction SilentlyContinue) -eq $null) {
		$global:SPSnapinAdded = $true
		Write-Host "Adding SharePoint module to PowerShell" -NoNewline
		Add-PSSnapin Microsoft.Sharepoint.Powershell -ErrorAction Stop
		Write-Host " - Done."
	}

	Write-Host "Adding Microsoft.SharePoint assembly" -NoNewline
	Add-Type -AssemblyName "Microsoft.SharePoint, Version=14.0.0.0, Culture=neutral, PublicKeyToken=71e9bce111e9429c"
	# Disable the above line and enable the line below for SharePoint 2013
	# Add-Type -AssemblyName "Microsoft.SharePoint, Version=15.0.0.0, Culture=neutral, PublicKeyToken=71e9bce111e9429c"
	Write-Host " - Done."
}

function Remove-Snapin {
	if ($global:SPSnapinAdded -eq $true) {
		Write-Host "Removing SharePoint module from PowerShell" -NoNewline
		Remove-PSSnapin Microsoft.Sharepoint.Powershell -ErrorAction SilentlyContinue
		Write-Host " - Done."
	}
}

function Get-ScriptDirectory
{
	$Invocation = (Get-Variable MyInvocation -Scope 1).Value
	return Split-Path $Invocation.MyCommand.Path
}

function Export-SPTerms {
    param (
        [string]$siteUrl = $(Read-Host -prompt "Please provide the site collection URL"),
        [string]$termGroupName = $(Read-Host -prompt "Please provide the term group name to export"),
        [string]$saveLocation = $(Read-Host -prompt "Please provide the path of the folder to save the CSV file to")
    )

	if ([IO.Directory]::Exists($saveLocation) -eq $false)
	{
		New-Item ($saveLocation) -Type Directory | Out-Null
	}

	Write-Host "Getting Taxonomy Session";
	$taxonomySession = Get-SPTaxonomySession -site $siteUrl
	$taxonomyTermStore =  $taxonomySession.TermStores | Select Name
	$termStore = $taxonomySession.TermStores[$taxonomyTermStore.Name]
	$fileRootNoteCreated = $false;

	# Ampersands are stored as full width ampersands (see http://www.fileformat.info/info/unicode/char/ff06/index.htm)
	[Byte[]] $amp = 0xEF,0xBC,0x86

	Write-Host "Looping through Term store Groups to find: '$termGroupName'"
	foreach ($group in $termStore.Groups) {
		Write-Host "Checking: '$($group.Name)'"
		$groupName = $group.Name.Replace([System.Text.Encoding]::UTF8.GetString($amp), "&");
		if ($groupName -eq $termGroupName) {

			Write-Host "Looping through Term sets"
		    foreach ($termSet in $group.TermSets) {
            	# Remove unsafe file system characters from file name
				$parsedFilename =  [regex]::replace($termSet.Name, "[^a-zA-Z0-9\-]", "_")

				# Use Join-Path so the export works whether or not $saveLocation ends with a backslash
				$file = New-Object System.IO.StreamWriter((Join-Path $saveLocation ("termset_" + $parsedFilename + ".xml")))

		        # Write out the headers
		        #$file.Writeline("Term Set Name,Term Set Description,LCID,Available for Tagging,Term Description,Level 1 Term, Level 2 Term,Level 3 Term,Level 4 Term,Level 5 Term,Level 6 Term,Level 7 Term")
				$file.Writeline("<termStore Name='" + $termStore.Name + "' GUID='" + $termStore.ID + "' Group='" + $groupName + "'>");
		        $file.Writeline("`t<termSet Name='" + $termSet.Name + "' GUID='" + $termSet.ID + "' Description='" + $termSet.Description + "'>");
				try {
					Export-SPTermSet $termSet.Terms
				}
				finally {
					$file.Writeline("`t</termSet>");
					$file.Writeline("</termStore>");
			        $file.Flush()
			        $file.Close()
				}
			}
		}
	}
}

function Export-SPTermSet {
    param (
        [Microsoft.SharePoint.Taxonomy.TermCollection]$terms,
		[int]$level = 1,
		[string]$previousTerms = ""
    )

	$tabCount = $level+1;
	if ($level -gt 1) {$tabCount = $tabCount + ($level-1);}

	if ($terms.Count -gt 0)
	{
		$file.Writeline("`t" * $tabCount + "<terms>");
	}

	if ($level -ge 1 -and $level -le 7)
	{
		if ($terms.Count -gt 0 ) {
			$termSetName = ""
			if ($level -eq 1) {
				$termSetName =  """" + $terms[0].TermSet.Name.Replace([System.Text.Encoding]::UTF8.GetString($amp), "&") + """"
			}
			$terms | ForEach-Object {
				$termName = $_.Name.Replace([System.Text.Encoding]::UTF8.GetString($amp), "&");
				$currentTerms = $previousTerms + ",""" + $termName + """";

				$file.Writeline("`t" * $tabCount + "`t<term Name='" + $termName + "' isAvailableForTagging='" + $_.IsAvailableForTagging + "'>");
				$file.Writeline("`t" * $tabCount + "`t`t<description>" + $_.GetDescription() + "</description>");

				if ($level -lt 7) {
					Export-SPTermSet $_.Terms ($level + 1) ($previousTerms + $currentTerms)
				}
				$file.Writeline("`t" * $tabCount + "`t</term>");
			}
		}
	}

	if ($terms.Count -gt 0)
	{
		$file.Writeline("`t" * $tabCount + "</terms>");
	}
}

try {
	Write-Host "Starting export of Metadata Termsets" -ForegroundColor Green
	$ErrorActionPreference = "Stop"
	Add-Snapin

	if (!($exportPath)) {
		$exportPath = (Get-ScriptDirectory)
	}

	Write-Host "Site: $siteUrl" -ForegroundColor Yellow
	Write-Host "Term Group: $termGroup" -ForegroundColor Yellow
	Write-Host "Export Path: $exportPath" -ForegroundColor Yellow

	Export-SPTerms $siteUrl $termGroup $exportPath
}
catch {
	Write-Host ""
    Write-Host "Error : " $Error[0] -ForegroundColor Red
	throw
}
finally {
	Remove-Snapin
}
Write-Host Finished -ForegroundColor Blue

export-terms-xml.ps1 (5.31 kb)
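
To run the export, open a SharePoint Management Shell (or a PowerShell session with the SharePoint snap-in available) on a server in the farm and call the script with its three parameters. The values below are just the script's defaults plus a placeholder output folder:

# One termset_<name>.xml file is written to the export path for each term set in the group
.\export-terms-xml.ps1 -siteUrl "http://sharepoint.local:2013" -termGroup "Sample Term Group" -exportPath "C:\TermExports\"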


Sudo for Windows

With the existence of UAC in the Windows world, I find myself looking for easier ways to run a command as administrator (usually from the command line) without needing to turn off UAC.

Here are some useful links that helped me solve this very issue:

Elevation Power Toys

http://technet.microsoft.com/en-us/magazine/2008.06.elevation.aspx (note: you also need the Sysinternals Suite installed in “%ProgramFiles%\Sysinternal Suite”)

Elevate Utility

http://code.kliu.org/misc/elevate/

Sudo for Windows

http://sourceforge.net/projects/sudowin/
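
If you just need an occasional elevated console and don't want to install anything, PowerShell can trigger the UAC prompt itself via Start-Process with the RunAs verb. A minimal sketch (the flushdns command is only an example):

# Launch an elevated command prompt (the UAC prompt appears if UAC is on)
Start-Process -FilePath cmd.exe -Verb RunAs

# Run a single command elevated, e.g. flush the DNS resolver cache
Start-Process -FilePath cmd.exe -ArgumentList '/c ipconfig /flushdns' -Verb RunAs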


Xml Formatting

Looking for a way to format XML?

http://www.bytechaser.com/en/resources/tp9h7nivzr/free-online-xml-formatting-tool.aspx

Some others:

http://stackoverflow.com/questions/521265/any-online-xml-formatter-or-formatter-in-free-text-editor

http://www.shell-tools.net/index.php?op=xml_format

http://aaronkarp.com/prettyXML/
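
If you would rather format XML locally than paste it into an online tool, a few lines of PowerShell using the .NET XmlWriter will pretty-print a file. A minimal sketch; the file paths are placeholders:

# Load the unformatted XML and rewrite it with indentation
$xml = [xml](Get-Content 'C:\temp\input.xml')
$settings = New-Object System.Xml.XmlWriterSettings
$settings.Indent = $true
$writer = [System.Xml.XmlWriter]::Create('C:\temp\input-formatted.xml', $settings)
$xml.Save($writer)
$writer.Close()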


Put any application in the system tray

http://www.blogtechnika.com/how-to-send-any-application-to-system-tray-in-windows-7


StickOut: A Desktop Sticky Notes Application in the .NET Framework 2.0

Summary: StickOut is a desktop sticky note with multi-user support and Outlook integration. As a .NET Framework 2.0 Windows Forms application, it uses .NET Remoting to communicate with other StickOut users and exchange sticky notes with them. (85 pages)

Download the associated StickOut.msi code sample.

Original Article: http://msdn.microsoft.com/en-us/library/aa480731.aspx


A couple of useful free tools

FreeCommander is an easy-to-use alternative to the standard Windows file manager. The program helps you with your daily work in Windows and offers all the functions you need to manage your data. You can take FreeCommander anywhere: just copy the installation directory to a CD or USB stick and you can work with the program even on another computer.

FuturixImager is a compact and customizable image viewer. It can open more than 40 file types, including all the most popular ones (GIF, JPEG, PNG, TIFF, JPEG2000, RAW, DNG).

Texter saves you countless keystrokes by replacing abbreviations with commonly used phrases you define. Unlike software-specific text replacement features, Texter runs in the Windows system tray and works in any application you’re typing in. Texter can also set return-to markers for your cursor and insert clipboard contents into your replacement text, in addition to more advanced keyboard macros.

Stickies is a PC utility written to cut down on the number of yellow notes its author kept leaving stuck to his monitor; it is a computerised version of those notes. The design goal behind Stickies is that the program is small and simple. Stickies will not mess with your system files or write to the registry; it stores its information in a single text-based ini file.

Windows includes a command-line tool called Runas that is handy for launching programs under different accounts, but it is not convenient if you are a heavy Explorer user. ShellRunAs provides functionality similar to that of Runas, launching programs as a different user via a convenient shell context-menu entry.

Command Prompt Here tool


Great low cost backup tool (online and offline)

http://b4.crashplan.com/consumer/index.html

 

Truly dependable backup means backing up to multiple locations – not just online – which until now could be complicated. CrashPlan automatically backs up to multiple destinations for FREE!

CrashPlan’s groundbreaking social backup concept makes it easy to back up to computers belonging to your network of friends or family for offsite backup, in addition to using your own computers and external drives for onsite backup. CrashPlan works on all your computers, so you don’t have to worry about compatibility either.

CrashPlan is true backup; uncomplicated, reliable and even a little fun.