itramblings

Ramblings from an IT manager and long-time developer.


Pretty good backup script for linux folders

This was originally taken from here, with some modifications by me.

Automating backups with tar

It is always worth automating backup tasks. Automation lets your Linux server do the work for you on a schedule. The example below is our backup script, called backup.cron. It is designed to run on any computer by changing only five variables:

  1. COMPUTER
  2. DIRECTORIES
  3. BACKUPDIR
  4. TIMEDIR
  5. BACKUPSET

We suggest that you set this script up and run it for the first time at the beginning of the month, and then let it run for a month before making major changes. In the example below we back up to a directory on the local server (BACKUPDIR), but you could modify the script to write to a tape drive or to an NFS-mounted file system.

  1. Create the backup script file backup.cron with touch /etc/cron.daily/backup.cron and add the following lines to it:
    #!/bin/sh
    # full and incremental backup script
    # created 07 February 2000
    # Based on a script by Daniel O'Callaghan <danny@freebsd.org>
    # and modified by Gerhard Mourani <gmourani@videotron.ca>
    # and modified by Shawn Anderson <sanderson@eye-catcher.com> on 2016-08-14
    #Change the 5 variables below to fit your computer/backup
    
    COMPUTER=$(hostname) # name of this computer
    BACKUPSET=HOMEDIR # name of the backup set
    DIRECTORIES="/home" # directories to backup
    BACKUPDIR=/backups # where to store the backups
    TIMEDIR=/backups/last-full # where to store time of full backup
    TAR=/bin/tar # name and location of tar
    
    #You should not have to change anything below here
    PATH=/usr/local/bin:/usr/bin:/bin
    DOW=`date +%a` # Day of the week e.g. Mon
    DOM=`date +%d` # Date of the Month e.g. 27
    DM=`date +%d%b` # Date and Month e.g. 27Sep
    
    #Set various things up
    
    # Is PV installed?
    type pv >/dev/null 2>&1 || sudo apt-get install -y pv
    
    # Do the required paths exist
    if [ ! -d $BACKUPDIR ]; then
       mkdir $BACKUPDIR
    fi
    
    if [ ! -d $TIMEDIR ]; then
       mkdir $TIMEDIR
    fi
    
    # On the 1st of the month a permanent full backup is made
    # Every Sunday a full backup is made - overwriting last Sunday's backup
    # The rest of the time an incremental backup is made. Each incremental
    # backup overwrites last week's incremental backup of the same name.
    #
    # If NEWER = "", then tar backs up all files in the directories;
    # otherwise it backs up files newer than the NEWER date. NEWER
    # gets its date from the file written every Sunday.
    
    # Monthly full backup
    if [ $DOM = "01" ]; then
       NEWER=""
       $TAR $NEWER -cf - $DIRECTORIES | pv -s $(du -sb $DIRECTORIES | awk '{print $1}') | gzip > $BACKUPDIR/$BACKUPSET-$COMPUTER-$DM.tgz
    fi
    
    # Weekly full backup
    if [ $DOW = "Sun" ]; then
       NEWER=""
       NOW=`date +%d-%b`
    
       # Update full backup date
       echo $NOW > $TIMEDIR/$COMPUTER-full-date
       $TAR $NEWER -cf - $DIRECTORIES | pv -s $(du -sb $DIRECTORIES | awk '{print $1}') | gzip > $BACKUPDIR/$BACKUPSET-$COMPUTER-$DOW.tgz
    
    # Make incremental backup - overwrite last weeks
    else
       # Get date of last full backup
       NEWER="--newer `cat $TIMEDIR/$COMPUTER-full-date`"
       $TAR $NEWER -cf - $DIRECTORIES | pv -s $(du -sb $DIRECTORIES | awk '{print $1}') | gzip > $BACKUPDIR/$BACKUPSET-$COMPUTER-$DOW.tgz
    fi
    
    # Remove backup files older than 90 days (this really shouldn't be necessary unless something
    # isn't right with the auto-rotation; I have it in just for good measure.)
    find $BACKUPDIR/$BACKUPSET-$COMPUTER* -mtime +90 -exec rm {} \;
    Here is an abbreviated look at the backup directory after one week:

    total 22217
    -rw-r--r-- 1 root root 10731288 Feb 7 11:24 deep-HOMEDIR-01Feb.tar
    -rw-r--r-- 1 root root     6879 Feb 7 11:24 deep-HOMEDIR-Fri.tar
    -rw-r--r-- 1 root root     2831 Feb 7 11:24 deep-HOMEDIR-Mon.tar
    -rw-r--r-- 1 root root     7924 Feb 7 11:25 deep-HOMEDIR-Sat.tar
    -rw-r--r-- 1 root root 11923013 Feb 7 11:24 deep-HOMEDIR-Sun.tar
    -rw-r--r-- 1 root root     5643 Feb 7 11:25 deep-HOMEDIR-Thu.tar
    -rw-r--r-- 1 root root     3152 Feb 7 11:25 deep-HOMEDIR-Tue.tar
    -rw-r--r-- 1 root root     4567 Feb 7 11:25 deep-HOMEDIR-Wed.tar
    drwxr-xr-x 2 root root 1024 Feb 7 11:20 last-full
    

    Important: The directory where the backups are stored (BACKUPDIR) and the directory where the time of the full backup is stored (TIMEDIR) must exist before the script runs, or you will receive an error message. (The script above creates them if they are missing.)

  2. If you are not running this backup script from the beginning of the month (01-month-year), the incremental backups will need the time of the Sunday full backup in order to work properly. If you start in the middle of the week, you will need to create the time file in the TIMEDIR directory with the following command:
    [root@deep] /# date +%d-%b > /backups/last-full/myserver-full-date

    Where /backups/last-full is our TIMEDIR variable, in which we store the time of the full backup; myserver in myserver-full-date is the name of our server (e.g. deep); and the time file consists of a single line containing the current date (e.g. 15-Feb).

  3. Make this script executable and change its default permissions so that it is writable only by the super-user root (mode 755):
    [root@deep] /# chmod 755 /etc/cron.daily/backup.cron

Because this script is in the /etc/cron.daily directory, it will be run automatically by cron once a day (in the early hours of the morning on most distributions).


Force Windows 8 to show all user accounts on the sign-on screen.

Taken from here: http://www.techrepublic.com/blog/windows-and-office/force-windows-8-to-show-all-user-accounts-on-the-sign-on-screen/


Powershell Script to Setup Internal/External URLS for Exchange 2013

From here: http://jaworskiblog.com/2013/04/13/setting-internal-and-external-urls-in-exchange-2013/

#
# Author: Scott Jaworski
# Website: jaworskiblog.com
# Version: 1.0
# Description: This script sets internal and external URLs on the specified Exchange 2013 Client Access Server
# then displays the results of all the URLs that have been set.
# How to Use: Copy the text file to a location on the Exchange server, change the .txt extension to .ps1,
# open the Exchange Management Shell, browse to the location of the script in EMS, and run .\Set-Exchange2013Vdirs.ps1
#

Function Set-Exchange2013Vdirs
{
$ExServer = Read-Host "Please enter the Exchange 2013 Server Name you'd like to set Vdirs  "
$InternalName = Read-Host "Input the internal domain name eg.. IntMail.domain.com  "
$ExternalName = Read-Host "Input the external domain name eg. ExtMail.domain.com  "

Write-Host "Configuring Directories for $ExServer.." -Foregroundcolor Green

Get-WebservicesVirtualDirectory -Server $ExServer | Set-WebservicesVirtualDirectory -InternalURL https://$InternalName/EWS/Exchange.asmx -ExternalURL https://$externalName/EWS/Exchange.asmx
Get-OwaVirtualDirectory -Server $ExServer | Set-OwaVirtualDirectory -InternalURL https://$InternalName/owa -ExternalURL https://$ExternalName/owa
Get-ecpVirtualDirectory -Server $ExServer | Set-ecpVirtualDirectory -InternalURL https://$InternalName/ecp -ExternalURL https://$ExternalName/ecp
Get-ActiveSyncVirtualDirectory -Server $ExServer | Set-ActiveSyncVirtualDirectory -InternalURL https://$InternalName/Microsoft-Server-ActiveSync -ExternalURL https://$ExternalName/Microsoft-Server-ActiveSync
Get-OABVirtualDirectory -Server $ExServer | Set-OABVirtualDirectory -InternalUrl https://$InternalName/OAB -ExternalURL https://$ExternalName/OAB
Set-ClientAccessServer $ExServer -AutodiscoverServiceInternalUri https://$internalName/Autodiscover/Autodiscover.xml
Set-OutlookAnywhere -Identity "$ExServer\Rpc (Default Web Site)" -InternalHostname $internalName -ExternalHostName $ExternalName -InternalClientAuthenticationMethod ntlm -InternalClientsRequireSsl:$True -ExternalClientAuthenticationMethod Basic -ExternalClientsRequireSsl:$True


Write-Host "Vdirs have been set to the following.." -Foregroundcolor Green
Write-Host "$ExServer EWS"
Get-WebservicesVirtualDirectory -Server $ExServer |Fl internalURL,ExternalURL
Write-Host "$ExServer OWA"
Get-OWAVirtualDirectory -Server $ExServer | Fl internalUrl,ExternalURL
Write-Host "$ExServer ECP"
Get-ECPVirtualDirectory -Server $ExServer | Fl InternalURL,ExternalURL
Write-Host "$ExServer ActiveSync"
Get-ActiveSyncVirtualDirectory -Server $ExServer | Fl InternalURL,ExternalURL
Write-Host "$ExServer OAB"
Get-OABVirtualDirectory -Server $ExServer | Fl InternalURL,ExternalURL
Write-Host "$ExServer Internal Autodiscover URL"
Get-ClientAccessServer $ExServer | Fl AutodiscoverServiceInternalUri
Write-Host "$Exserver Outlook Anywhere Settings"
Get-OutlookAnywhere -Identity "$ExServer\Rpc (Default Web Site)" |fl internalhostname,internalclientauthenticationmethod,internalclientsrequiressl,externalhostname,externalclientauthenticationmethod,externalclientsrequiressl

Write-Host "The Powershell URL have not been set as part of this script. Set it if you choose" -ForegroundColor Yellow
}
Set-Exchange2013Vdirs
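
Because the last line invokes the function, running the saved script from the Exchange Management Shell is all that is needed. The path and prompt answers below are placeholders, not values from the original post:

# Run from the Exchange Management Shell
cd C:\Scripts
.\Set-Exchange2013Vdirs.ps1
# ...then answer the three prompts, e.g. EXCH01, mail.corp.local and mail.contoso.com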


Export SharePoint Terms Group to XML

Here is a PowerShell script to export SharePoint Term Groups to XML

 

param(
	[string]$siteUrl = "http://sharepoint.local:2013",
	[string]$termGroup = "Sample Term Group",
	[string]$exportPath = $null
)


function Add-Snapin {
	if ((Get-PSSnapin -Name Microsoft.Sharepoint.Powershell -ErrorAction SilentlyContinue) -eq $null) {
		$global:SPSnapinAdded = $true
		Write-Host "Adding SharePoint module to PowerShell" -NoNewline
		Add-PSSnapin Microsoft.Sharepoint.Powershell -ErrorAction Stop
		Write-Host " - Done."
	}

	Write-Host "Adding Microsoft.SharePoint assembly" -NoNewline
	Add-Type -AssemblyName "Microsoft.SharePoint, Version=14.0.0.0, Culture=neutral, PublicKeyToken=71e9bce111e9429c"
	# Disable the above line and enable the line below for SharePoint 2013
	# Add-Type -AssemblyName "Microsoft.SharePoint, Version=15.0.0.0, Culture=neutral, PublicKeyToken=71e9bce111e9429c"
	Write-Host " - Done."
}

function Remove-Snapin {
	if ($global:SPSnapinAdded -eq $true) {
		Write-Host "Removing SharePoint module from PowerShell" -NoNewline
		Remove-PSSnapin Microsoft.Sharepoint.Powershell -ErrorAction SilentlyContinue
		Write-Host " - Done."
	}
}

function Get-ScriptDirectory
{
	$Invocation = (Get-Variable MyInvocation -Scope 1).Value
	return Split-Path $Invocation.MyCommand.Path
}

function Export-SPTerms {
    param (
        [string]$siteUrl = $(Read-Host -prompt "Please provide the site collection URL"),
        [string]$termGroupName = $(Read-Host -prompt "Please provide the term group name to export"),
        [string]$saveLocation = $(Read-Host -prompt "Please provide the path of the folder to save the XML files to")
    )

	if ([IO.Directory]::Exists($saveLocation) -eq $false)
	{
		New-Item ($saveLocation) -Type Directory | Out-Null
	}

	Write-Host "Getting Taxonomy Session";
	$taxonomySession = Get-SPTaxonomySession -site $siteUrl
	$taxonomyTermStore =  $taxonomySession.TermStores | Select Name
	$termStore = $taxonomySession.TermStores[$taxonomyTermStore.Name]
	$fileRootNoteCreated = $false;

	# Ampersands are stored as full width ampersands (see http://www.fileformat.info/info/unicode/char/ff06/index.htm)
	[Byte[]] $amp = 0xEF,0xBC,0x86

	Write-Host "Looping through Term store Groups to find: '$termGroupName'"
	foreach ($group in $termStore.Groups) {
		Write-Host "Checking: '$($group.Name)'"
		$groupName = $group.Name.Replace([System.Text.Encoding]::UTF8.GetString($amp), "&");
		if ($groupName -eq $termGroupName) {

			Write-Host "Looping through Term sets"
		    foreach ($termSet in $group.TermSets) {
            	# Remove unsafe file system characters from file name
				$parsedFilename =  [regex]::replace($termSet.Name, "[^a-zA-Z0-9\-]", "_")

				$file = New-Object System.IO.StreamWriter($saveLocation + "termset_" + $parsedFilename + ".xml")

		        # Write out the headers
		        #$file.Writeline("Term Set Name,Term Set Description,LCID,Available for Tagging,Term Description,Level 1 Term, Level 2 Term,Level 3 Term,Level 4 Term,Level 5 Term,Level 6 Term,Level 7 Term")
				$file.Writeline("<termStore Name='" + $termStore.Name + "' GUID='" + $termStore.ID + "' Group='" + $groupName + "'>");
		        $file.Writeline("`t<termSet Name='" + $termSet.Name + "' GUID='" + $termSet.ID + "' Description='" + $termSet.Description + "'>");
				try {
					Export-SPTermSet $termSet.Terms
				}
				finally {
					$file.Writeline("`t</termSet>");
					$file.Writeline("</termStore>");
			        $file.Flush()
			        $file.Close()
				}
			}
		}
	}
}

function Export-SPTermSet {
    param (
        [Microsoft.SharePoint.Taxonomy.TermCollection]$terms,
		[int]$level = 1,
		[string]$previousTerms = ""
    )

	$tabCount = $level+1;
	if ($level -gt 1) {$tabCount = $tabCount + ($level-1);}

	if ($terms.Count -gt 0)
	{
		$file.Writeline("`t" * $tabCount + "<terms>");
	}

	if ($level -ge 1 -or $level -le 7)
	{
		if ($terms.Count -gt 0 ) {
			$termSetName = ""
			if ($level -eq 1) {
				$termSetName =  """" + $terms[0].TermSet.Name.Replace([System.Text.Encoding]::UTF8.GetString($amp), "&") + """"
			}
			$terms | ForEach-Object {
				$termName = $_.Name.Replace([System.Text.Encoding]::UTF8.GetString($amp), "&");
				$currentTerms = $previousTerms + ",""" + $termName + """";

				$file.Writeline("`t" * $tabCount + "`t<term Name='" + $termName + "' isAvailableForTagging='" + $_.IsAvailableForTagging + "'>");
				$file.Writeline("`t" * $tabCount + "`t`t<description>" + $_.GetDescription() + "</description>");

				if ($level -lt 7) {
					Export-SPTermSet $_.Terms ($level + 1) ($previousTerms + $currentTerms)
				}
				$file.Writeline("`t" * $tabCount + "`t</term>");
			}
		}
	}

	if ($terms.Count -gt 0)
	{
		$file.Writeline("`t" * $tabCount + "</terms>");
	}
}

try {
	Write-Host "Starting export of Metadata Termsets" -ForegroundColor Green
	$ErrorActionPreference = "Stop"
	Add-Snapin

	if (!($exportPath)) {
		$exportPath = (Get-ScriptDirectory)
	}

	Write-Host "Site: $siteUrl" -ForegroundColor Yellow
	Write-Host "Term Group: $termGroup" -ForegroundColor Yellow
	Write-Host "Export Path: $exportPath" -ForegroundColor Yellow

	Export-SPTerms $siteUrl $termGroup $exportPath
}
catch {
	Write-Host ""
    Write-Host "Error : " $Error[0] -ForegroundColor Red
	throw
}
finally {
	Remove-Snapin
}
Write-Host Finished -ForegroundColor Blue

export-terms-xml.ps1 (5.31 kb)
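
A typical invocation, assuming the file is saved as export-terms-xml.ps1 and run from a SharePoint management shell. The export path is an example; keep the trailing backslash, since the script concatenates it directly with the generated file name:

.\export-terms-xml.ps1 -siteUrl "http://sharepoint.local:2013" `
                       -termGroup "Sample Term Group" `
                       -exportPath "C:\TermExports\"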


JavaScript and jQuery – useful posts

http://www.quirksmode.org/js/associative.html

http://shikii.net/blog/organizing-jquery-projects-objects-and-namespaces/

http://javascriptweblog.wordpress.com/2010/12/07/namespacing-in-javascript/


Server 2012 and PowerShell published via OData

Repost from here

 

There is a new feature of Windows Server 8 that will allow for access to PowerShell cmdlets and objects via OData served through ASP.NET. Doug Finke wrote a blog post for PowerShell Magazine on the topic. The article gives a good overview of what the Management OData feature is and how to configure it. In this blog post I will be showing off some of the steps involved in getting the service configured and what it looks like to consume the OData in PowerShell.

Setting Up the Management OData Feature

The first step in utilizing the Management OData in Windows Server 2012 is to enable the feature. You can either use Server Manager or the following cmdlet.

Install-WindowsFeature -Name ManagementOData

Once the feature has been installed you will need to install Visual Studio 2011 Beta on the server machine. I hope there will be better tooling around this, but currently it is what is required in order to build the samples found on MSDN. Make sure to download the Management OData Schema Designer. For a whole lot more information on the topic, download and read the whitepaper. It walks you through all the steps I’m about to show, in more detail.

What Management OData Does

In simple terms (read Doug’s post for more info), the Management OData service provides RESTful endpoints that serve up PowerShell objects. The schema designer is used to map cmdlets and their resulting objects to OData objects. These can then be served as JSON back through the endpoint to the client. The Management OData Schema Designer is used to take existing modules, cmdlets and objects and map them to XML files that can then be consumed by the Management OData system and served to clients. Included with the examples are PowerShell scripts used to install the OData endpoints once they have been compiled.

Installing the Basic Endpoint Sample

Once Visual Studio 2011 has been installed, open the BasicPlugin.sln solution (you can download the sample here) and build it. Once the solution is built, run the SetupEndpoints.ps1 file, which is part of the Basic Endpoint solution folder, to configure the endpoint on the local server. You should now be able to navigate to the URL:

http://localhost:7000/MODataSvc/Microsoft.Management.Odata.svc

This should result in the following:

[Screenshot: OData service root document]

 

To query a particular resource, append its name to the service URL, like so: http://localhost:7000/MODataSvc/Microsoft.Management.Odata.svc/Process. This will return all the processes on the server machine. The Content property contains all the XML for the objects.

[Screenshot: Process feed returned by the endpoint]

Additionally, processes can be filtered using a URI-based syntax. For example:

http://localhost:7000/MODataSvc/Microsoft.Management.Odata.svc/Process?$filter=(Handles gt 1000)

By default the returned format will be XML. In order to return JSON, you have to use the following syntax.

http://localhost:7000/MODataSvc/Microsoft.Management.Odata.svc/Process?$format=JSON

[Screenshot: JSON output from the endpoint]

Remember that you can utilize the new ConvertFrom-Json cmdlet to convert the JSON to objects.

[Screenshot: converted results]
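
As a minimal sketch (assuming the sample endpoint above and default Windows authentication; the exact JSON envelope and property names can vary):

# Request the Process feed as JSON and convert it to PowerShell objects
$uri  = 'http://localhost:7000/MODataSvc/Microsoft.Management.Odata.svc/Process?$format=JSON'
$resp = Invoke-WebRequest -Uri $uri -UseDefaultCredentials
$data = $resp.Content | ConvertFrom-Json
# With the verbose OData JSON format the entries sit under a d/results wrapper
$data.d.results | Select-Object Name, Handles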

Pretty powerful stuff!

Adding new Cmdlets and Objects

The Management OData Schema Designer is used to add more entities to the OData service. Once installed there should be an icon placed on the desktop. Open the designer. The first step is to load a module that you want to model and expose in OData. For this example I will use the NetAdapter module. Type the module name into the text box and click Load New Module. Once the module is loaded you will see a big list of the types of entities the designer finds within the module. These coincide with the nouns of the cmdlets within the module. The verbs Get, Set, New, Remove will appear as check boxes next to the noun names. If a cmdlet is not defined the verb will be grayed out.

In order to map a new OData action and entity, check one of the verbs for a noun. I selected Get as the verb and NetAdapter as the noun. Next click the “from cmdlet output” button. The cmdlet will appear in the displayed box. Clicking Add-Type will add the new OData entity. In order to successfully generate the MOF and XML needed to define the object, you will need to set a Key property. This is the uniquely defining property on the object. Name is already selected for NetAdapter.

[Screenshot: Management OData Schema Designer]

Now you can click Generate Mof/Xml Schema. This will produce the mapping files that the Management OData service will use to translate between the REST request and the PowerShell cmdlet and resulting objects. Once saved, you can place this in C:\inetpub\wwwroot\MOData.

Since the OData endpoint is constrained we need to play with the BasicPlugin a bit to get it to load the module we would like. In Visual Studio, I added the following lines to get the NetAdapter module to load into the runspace and to set the visibility of the proper cmdlets in the runspace. I just set them all to visible. Once built, copy the resulting DLL into the MOData folder and replace the one that is in there already. You may need to stop IIS first.

[Screenshot: Visual Studio 2011 - code added to the BasicPlugin]

Now you should be able to query the location:

http://localhost:7000/MODataSvc/Microsoft.Management.Odata.svc/NetAdapter

Note that the resource identifier (e.g. NetAdapter) is case sensitive in the beta!

Remember there is also an Invoke-RestMethod cmdlet that can be used to query the entities. Using this method, it looks something like this.

[Screenshot: Invoke-RestMethod results]
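
A sketch of that call against the NetAdapter resource added above (again assuming Windows authentication; the properties you get back depend on the entities you mapped in the schema designer):

# Invoke-RestMethod performs the request and deserializes the response in one step
$uri = 'http://localhost:7000/MODataSvc/Microsoft.Management.Odata.svc/NetAdapter?$format=JSON'
$adapters = Invoke-RestMethod -Uri $uri -UseDefaultCredentials
$adapters.d.results | Select-Object Name, InterfaceDescription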

I think this is insanely powerful functionality. It seems that the tooling isn’t quite 100% yet and requires quite a bit of setup to get running but the possibilities are endless. Creating RESTful services from PowerShell modules will be a cinch! I really encourage you to read the whitepaper about this that I mentioned earlier. It contains a ton of information.



The SharePoint & jQuery Guide

Repost from here

 

So, you’ve finally succumbed to the hype and decided to use jQuery in SharePoint? Good for you. I’m sure you are fully prepared with the knowledge of the pros and cons of using jQuery as well as have all the requisite knowledge.

You should NOT be writing jQuery in SharePoint if…

You don’t? You mean you are going to copy and paste scripts from blogs and then ask the blogger to modify the script to work for your particular circumstance? Oh, that’s nice.

Well, assuming you actually want to understand what you are doing, maybe even learn a thing or two, I’ve decided to link to several of my past jQuery blogs in an order which will hopefully help you learn how to successfully use SharePoint and jQuery together. I’ll even do my best to keep this post updated as I write new blogs on the subject.

Good luck!

Before getting started…

Make sure you have some good background knowledge.

When you start using things like SPServices and the Client Object Model to query SharePoint list data you will need to understand SharePoint’s query language CAML. I always refer to it as the ugly crossbreed of XML and SQL.

If you plan to use SPServices to access SharePoint’s Web Services (which I also reference in several of the blogs below), you need to get that at http://spservices.codeplex.com.

When to Choose SPServices vs. the Client Side Object Model (CSOM)

Also, if you are wondering if you should use SPServices instead of the Client Object Model you can read the helpful post put together by Marc Anderson and myself.

Be sure to keep the jQuery.com site handy as this is by far the most up-to-date site to reference the jQuery api.

The Intro to jQuery and SharePoint blogs

Here are the blogs (in learning order) to help you deploy and get started using jQuery in SharePoint.

A Dummies Guide to SharePoint and jQuery–Getting Started

This blog assumes you know nothing. What is jQuery and how can I deploy it in SharePoint?

Undoubtedly one of the most common tasks you will perform with jQuery is getting and setting SharePoint form fields. This blog walks you through the process with the most common field types. A corresponding blog post to this one is Setting SharePoint Lookup Lists w/ jQuery (+/- 20 items) because unfortunately at some point you will fall victim to this SharePoint quirk.

Creating a SharePoint Parent/Child List Relationship–No SPD Version

So, now that you understand the basics of using jQuery in SharePoint and know how to get/set SharePoint form fields, this blog helps you apply that knowledge to perform the common task of creating an automatic parent/child list relationship. There are MANY ways of accomplishing this functionality, but I actually prefer this method. It may be a little more technical than the other solutions I’ve blogged about for creating a Parent/Child list relationship, but this solution has the least impact on SharePoint and should also upgrade easily.

Okay, at this point you should be ready to start interacting with SharePoint list data using SPServices. This blog post walks you through the basics with a VERY commented script. If you think you have your head wrapped around reading list data with SPServices, you might also check out the following blog posts using SPServices to accomplish some tasks normally achieved using server side code:

Using SPServices & jQuery to Clone a Parent Item and All its Children – Reads items from a SharePoint List and then creates copies of those items.

Using SPServices & jQuery to Find My Stuff from Multi-Select Person/Group Field – Determines the groups a current user belongs to in order to determine if the current user is part of a group or person assigned to a list item.

A Dummies Guide to SharePoint & jQuery – An Easy to Use Content Slider

Now that jQuery and SPServices is old hat, I walk you through the process of integrating SharePoint List Data into a third party jQuery library.

Other SharePoint tips and tricks using jQuery

So, here is a smattering of other blog posts on the subject which you may find helpful, or might give you ideas for your projects. Some of them are just academic, and some you can put into practice immediately.

Using jQuery to Print a Web Part – In this blog I use a very simple third party library to print a specific web part and not the entire page.

Using jQuery to Make a Field Read Only in SharePoint – Another tip you could use to make a SharePoint form field read only.

SharePoint List Views –Quick & Easy jQuery Filter – Using a very simple script, add a filter box to an out of the box list view that filters the rows based upon what the user enters.

Using Google Maps to Populate an Address in a SharePoint New Item Form – Before I started using the Bing Map api, I played around with the Google Maps API.

A Dummies’ Guide to SharePoint and jQuery–A Real World Example – A mostly academic post on the types of things you can do with jQuery to modify a page in the _Layouts directory that you don’t have direct access to.

Essential Links for the SharePoint Client Side Developer – This is a list of a lot of the jQuery/JavaScript libraries I use along with some suggestions from others. I actually need to update this list soon.

So, What’s next?

What? Is this not enough? I’d say for the price you paid, it’s a bargain!!

I hope to do a few blog entries on more advanced topics like Callbacks, templates, and design patterns. Also, as we move towards SharePoint vNext I do think it might be a good idea to start to wean myself off of SPServices and start using the Client Object Model more (although nothing leads me to believe that SPServices will not work in vNext). So, look for me to start a series on the Client Object Model as well and provide any tips or tricks I learn along the way.

As always, let me know what YOU want to learn and I’ll see what I can do! Thanks again for stopping by.


Using Powershell to play MP3 Audiobooks on Windows Phone

Re-post from here

 

So about a year ago I changed my phone to a Windows Phone 7. Overall I’m really happy with it. Without setting the world alight with evangelistic wars, I think that if the Metro interface had come before the iPhone arrived then it’s likely it would be the dominant UI.

Anyway, one major problem arrived when it came to playing audiobooks. The issue was that I had three challenges when playing them on the Windows phone.

a) For space considerations, I wanted “read” chapters to be removed from the phone automatically

b) Some books have hundreds of chapters, but some have 12 chapters each >1hr in length. It was almost impossible to pick up where you left off. Therefore I needed to use the podcast “resume” functionality in Zune

c) This led to problem #3 – when I converted the “genre” to podcast using MP3Tagedit, the chapters would frequently end up in a jumbled play order depending on their dates, so I then needed to ensure that chapter 1 had an earlier timestamp than chapter 2, etc., to force the right order in WP7. I could eventually get what I wanted, but it was very painful and involved a bunch of manual steps as well as using BulkEditor and MP3Tagedit.

I found part of the answer in an excellent post over here, where it referenced TagLib#. So I launched PowerShell for the first time and wrote the script below.

Now, if you target it at a folder such as…

William Shakespeare/

Merchant of Venice/

Chapter1.mp3 ……

Hamlet/

Chapter1.mp3…..

It will loop through the subdirectories… renaming the files in sequence… setting the appropriate MP3 tag entries… Author = William Shakespeare/Album = BookDirectoryName/Chapter = Title.. and setting timestamp to force Zune to recognise chapter 1 as being before chapter 2..

So, here’s the code I used:

Function Zune
{
	[CmdletBinding()]
	param(
		[Parameter(Position=0, Mandatory=$True)]
		[ValidateNotNullOrEmpty()]
		[System.String]
		$Library,

		[Parameter(Position=1)]
		[ValidateNotNull()]
		[System.String]
		$Genre
	)

	cls

	$OriginDir = Get-Location

	#========Does the library exist and contain books?=============================================

	if ((Test-Path -Path $Library) -eq $false) {
		write-output "$Library Does Not Exist"
		Break
	}

	$Books = Get-ChildItem $Library | where {$_.psiscontainer}
	if (!$books) {
		Write-Output "$Library Contains no books"
		Break
	}

	$librarylocation = Get-Item $Library

	#========Default Parameters==================================================================
	if (!$Genre)
	{
		$Genre="Podcast"
		Write-Host "Genre is blank, using Default $Genre" -foregroundcolor DarkGreen
	}
	else
	{
		Write-Output "Genre is " + $Genre
	}

	#========Load Assemblies======================================================================
	$TaglibLocation = (Resolve-Path ($ProfileDir + "taglib\taglib-sharp.dll"))
	Add-Type -Path ($TaglibLocation)
	Write-output "Taglib Loaded Successfully"

	#========Set up Recursive Subdirectories======================================================
	Write-Output "Processing the Library $Library"
	$NumberBooks = $books | Measure-Object | Select-Object -ExpandProperty Count
	Write-Output "Total Number of Runs is $Numberbooks"

	if ($NumberBooks -gt 0)
	{
		Set-Location $Librarylocation
		foreach ($book in $Books)
		{
			Write-Host "==================================================" -foregroundcolor Blue
			Write-Output "$("Processing the Book ")$($book.name)$(" in ")$($librarylocation)"
			$Chapters = Get-ChildItem $book.name *.mp3 | Sort-Object Name
			$NumberChapters = $Chapters | Measure-Object | Select-Object -ExpandProperty Count

			if ($NumberChapters -gt 0)
			{

				if ($NumberChapters -le 100)
				{
					$pad = 2
				}
				elseif ($NumberChapters -le 255)
				{
					$pad = 3
				}
				elseif ($NumberChapters -ge 256)
				{
					Write-Output "$("Too Many chapters in the book :")$($book.name)"
					Break
				}

				Write-Output "$("Processing ")$($NumberChapters)$(" Chapters")"

				#========Set up Recursive Subdirectories==========================================
				$book.fullname
				set-location $book.fullname
				$i=1
				foreach ($chapter in $Chapters)
				{
					$title = "{0:D$pad}" -f ($i) +"$(" of $NumberChapters- ")$($book.name)"
					$Filename = $title + '.mp3'
					$chapter.fullname
					$media = [TagLib.File]::Create($chapter.Fullname)
					$media.Tag.Title = $title
					$media.Tag.Performers = "Narrator"
					$media.Tag.AlbumArtists = (get-culture).Textinfo.Totitlecase($Library.tolower())
					$media.Tag.Composers = "Composer"
					$media.Tag.Album = $book.name
					$media.Tag.Comment = $chapter.Name
					$media.Tag.Genres = $Genre

					if (!$media.Tag.Year) {$media.tag.year = (Get-date).Year}
					$media.Tag.Track = $i
					$media.Tag.Artists = (get-culture).Textinfo.Totitlecase($Library.tolower())

					$media.Save()

					$chapter.Creationtime = (Get-Date).date.addminutes($i)
					$chapter.LastWriteTime = (Get-Date).date.addminutes($i)
					#$chapter.LastAccessTime = (Get-Date).date.addminutes($i)
					$chapter.LastWriteTimeUTC = (Get-Date).date.addminutes($i)
					$chapter.CreationtimeUTC = (Get-Date).date.addminutes($i)
					#$chapter.LastAccessTimeUTC = (Get-Date).date.addminutes($i)
					Write-Output "$("Renaming ")$($chapter.name)$(" to ")$($Filename)"
					Rename-Item $chapter.fullname $filename

					$i++
				}
			}
			else
			{
				Write-Output "$("No Chapters inside the book :")$($book)"
			}

		Set-Location $Librarylocation
		}

		#===============================================================================================
		#========================     end of loop on books =============================================
		#===============================================================================================
	}
	else
	{
		Write-Output "$($"No Books inside the Library "$($Library))"
	}

	#===================================================================================================
	#========================     end of function     ==================================================
	#===================================================================================================
	Set-Location $OriginDir
}

Export-ModuleMember -Function Zune
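
A sketch of how a run might look, assuming the code above is saved as Zune.psm1 and the audiobook folders sit under a library folder as in the layout described earlier. The script resolves taglib-sharp.dll relative to a $ProfileDir variable, so that needs to be set first; every path and name below is illustrative:

$ProfileDir = "C:\Scripts\"            # folder that contains taglib\taglib-sharp.dll
Import-Module C:\Scripts\Zune.psm1
Set-Location C:\Audiobooks
Zune -Library "William Shakespeare"    # the author folder; -Genre defaults to "Podcast"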


Articles about Templates and jQuery

http://weblogs.asp.net/dwahlin/archive/2011/11/23/reducing-javascript-code-by-using-jsrender-templates-in-html5-applications.aspx


Dynamically adding JavaScript script tags at runtime

This is a repost from here

First, let’s discuss the issue of loading all the JavaScript files together in one go.

  1. JavaScript downloads are not done in parallel, so the browser will not start any other download until it has loaded the scripts.
  2. We tend to load JavaScript files and methods which are unnecessary for a page. For example, a login page only needs the scripts essential to logging in, but developers will add all the scripts to a common header.

 

<script language="javascript" src="scriptRequiredForUserProfile.js"></script>
<script language="javascript" src="scriptRequiredForAddComment.js"></script>
<script language="javascript" src="scriptRequiredForReply.js"></script>


How to solve the above 2 issues?

  • Include only one script in the header.

 

<script src="loadJS.js"></script>


 

Now we can load the JavaScript dynamically in two ways.

 

Method 1:

 

Generate a dynamic <script> tag using the DOM

 

Maintain a list of the JavaScript files that need to be downloaded for each page.

 

 

var jsFilesArray = new Array();
jsFilesArray['home.html'] = new Array('validation.js', 'home.js');
jsFilesArray['login.html'] = new Array('validation.js', 'error.js');

Code to get File Name of the current page

 

 

function getCurrentPageName() {
    var fileName = document.location.href;
    var end = (fileName.indexOf("?") == -1) ? fileName.length : fileName.indexOf("?");
    return fileName.substring(fileName.lastIndexOf("/")+1, end);
}

Code to generate the dynamic script tags and append them to the head tag.

 

 

var head = document.getElementsByTagName("head")[0];

for (var i = 0; i < jsFilesArray[getCurrentPageName()].length; i++)
{
  var script = document.createElement('script');
  script.id = "id_" + i;
  script.type = 'text/javascript';
  script.src = jsFilesArray[getCurrentPageName()][i];
  head.appendChild(script);
}

 

 

Click to download the code.

 

Method 2:

 

Use AJAX to load the JS file dynamically whenever required

 

Get the required JS file using XMLHttpRequest and execute the output by passing it into the eval() method. The eval() method executes its argument as JavaScript.

 

So guys, load your JavaScript dynamically and experience the improved performance of your application.

 

Enjoy Coding :)