If you deploy your app to Azure Cloud Services you'll notice the installation drive yo-yos between E: and F: with each deployment. These drives are always created with rather limited capacity – roughly 1.5GB.
Certain Sitecore operations push a lot of data to disk: logging, media caches, icons, installation history and plenty more. As part of your deployment I'd suggest you set up the following changes:
- Move the data folder to another location
  - This can be patched in with the sc.variable dataFolder
- Move the media cache
  - This can be patched in by setting Media.CacheFolder
  - Remember to also update the cleanup agents, which look to delete old file-based media entries. These are referenced in the CleanupAgent.
- Move the temp folder
  - Now, this isn't quite so simple. The reason for noting this item is that installing update files creates large __UpgradeHistory folders, and things like icons are cached in the /temp folder. For this reason /temp needs to live as /temp under the site root. With some help from support, we concluded that a symlink was a better approach than a virtual directory. For details on setting one up via PowerShell, see http://stackoverflow.com/questions/894430/powershell-hard-and-soft-links. As part of our startup scripts we included a call to the script that shifts things around.
  - The clue that a folder is a symlink is that its icon changes.
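As a sketch, the first two moves can live in a single Sitecore include/patch file. The target paths under C:\Resources here are assumptions – point them at whatever roomier local storage your role has:

```xml
<!-- Sketch of an App_Config\Include patch file.
     The C:\Resources paths are illustrative assumptions. -->
<configuration xmlns:patch="http://www.sitecore.net/xmlconfig/">
  <sitecore>
    <sc.variable name="dataFolder">
      <patch:attribute name="value">C:\Resources\Data</patch:attribute>
    </sc.variable>
    <settings>
      <setting name="Media.CacheFolder">
        <patch:attribute name="value">C:\Resources\MediaCache</patch:attribute>
      </setting>
    </settings>
  </sitecore>
</configuration>
```

Keeping this in an include file means the change deploys with the rest of your code rather than being hand-edited on the box.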
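For the temp folder, a minimal PowerShell sketch of the symlink approach might look like this – the paths and the E:/F: drive check are illustrative, and it needs to run elevated (e.g. from a startup task):

```powershell
# Sketch only - adjust paths for your role's sitesroot layout.
# Work out which drive this deployment landed on (E: or F:)
$drive = if (Test-Path "e:\sitesroot") { "e:" } else { "f:" }
$siteRoot = "$drive\sitesroot\0"

# Assumed roomier local folder to hold the real temp content
$target = "c:\Resources\temp"
New-Item -ItemType Directory -Path $target -Force | Out-Null

# mklink lives in cmd; /D creates a directory symbolic link
cmd /c mklink /D "$siteRoot\temp" "$target"
```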
Azure allows you to pipe loads of stats and diagnostic information to blob and table storage. Google for “WAD Azure” and you’ll find a lot more information.
It's great for logging things like the event queue, performance counters and loads more over time.
If you want to push Sitecore counters through to this you are in luck. Make sure the counters are installed on the box – for web roles this can be done via a startup task.
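For example, a ServiceDefinition.csdef fragment can call an elevated cmd that registers the counters – InstallCounters.cmd here is a placeholder name for whatever counter-installation script you use:

```xml
<!-- Sketch: ServiceDefinition.csdef fragment. InstallCounters.cmd is a
     placeholder for your own counter-installation script. -->
<WebRole name="WebRole1">
  <Startup>
    <Task commandLine="InstallCounters.cmd" executionContext="elevated" taskType="simple" />
  </Startup>
</WebRole>
```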
Then update your ‘diagnostics.wadcfg’ config to include the counters you want – make sure you include the (*)!
```xml
<!-- updated as per https://sdn.sitecore.net/Articles/Administration/Sitecore%20Performance/Optimizing%20Sitecore%206%20and%20later/Optimizing%20Performance%20in%20Sitecore.aspx -->
<PerformanceCounterConfiguration counterSpecifier="\Sitecore.Caching(*)\CacheClearings" sampleRate="PT1M" />
<PerformanceCounterConfiguration counterSpecifier="\Sitecore.Caching(*)\CacheHits" sampleRate="PT1M" />
<PerformanceCounterConfiguration counterSpecifier="\Sitecore.Caching(*)\CacheMisses" sampleRate="PT1M" />
<PerformanceCounterConfiguration counterSpecifier="\Sitecore.Data(*)\Data.ItemsAccessed" sampleRate="PT1M" />
<PerformanceCounterConfiguration counterSpecifier="\Sitecore.Data(*)\Data.PhysicalReads" sampleRate="PT1M" />
<PerformanceCounterConfiguration counterSpecifier="\Sitecore.Data(*)\Data.PhysicalWrites" sampleRate="PT1M" />
<PerformanceCounterConfiguration counterSpecifier="\Sitecore.System(*)\Exceptions.ExceptionsThrown" sampleRate="PT1M" />
```
This caught me out for a while; I was missing the (*).
There is a handy cmd you can run to see all the counters available:

```
typeperf -q -o "C:\Temp\counters.txt"
```
Moving to the cloud offers many new opportunities for how you store and handle your data. One challenge we ran into was how to migrate a large Sitecore web database into a PaaS SQL Azure database.
There are quite a few tools for doing this – even SQL Server Management Studio can help. The problem we faced was that the source database had several filegroups set up, and SQL Azure doesn't support these.
To get around it we first created the database through the portal, then scripted in the database structure – follow the steps in this link: http://blog.sqlauthority.com/2010/06/04/sql-server-generate-database-script-for-sql-azure/
The second step was the data. We used the SQL Azure Migration Wizard (https://sqlazuremw.codeplex.com/). When selecting what to migrate, choose 'data only'. This can take a while to run – several hours in our case – but at least it got the data up to the cloud.
Azure offers some very useful out-of-the-box tools for pooling diagnostic information into blob and table storage. http://azure.microsoft.com/en-gb/documentation/articles/cloud-services-dotnet-diagnostics/ contains lots of information on this.
In your solution you should end up with a file, diagnostics.wadcfg, under the Roles folder. One issue we ran into occurred when we renamed the web project: the diagnostics file then sat in the wrong place. Ensure the structure is as follows:
```
-- files for your site
-- Roles (default folder name)
--- WebProject (folder named the same as your web project)
```
You can verify this by checking the Azure project's .ccproj file. You should see:

```xml
<Folder Include="WebProjectContent\" />
```
When the site fires up you should then see blob storage contain a folder called wad-control-container. If you have a look in there, it should contain a file that mimics the content of your diagnostics.wadcfg file.
I found it was useful to clear out this folder before testing new deployments.
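If you want to script that clean-up, a sketch using the classic Azure PowerShell storage cmdlets might look like the following – the account name and key are placeholders:

```powershell
# Sketch: empty wad-control-container before a fresh deployment.
# Account name/key are placeholders; cmdlet names are from the
# classic Azure PowerShell module and may differ in newer versions.
$ctx = New-AzureStorageContext -StorageAccountName "mystorage" `
                               -StorageAccountKey "<key>"
Get-AzureStorageBlob -Container "wad-control-container" -Context $ctx |
    Remove-AzureStorageBlob -Context $ctx
```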
One feature Azure offers for getting your boxes configured is the notion of startup tasks – I won’t go into too much detail here as there is lots available online e.g. https://msdn.microsoft.com/en-us/library/azure/hh180155.aspx
As part of setting these up I thought I’d share a few tips / gotchas that caught me out when running powershell from the cmd tasks.
My solution setup had a StartupScripts folder in the web project containing script.cmd and script.ps1. I'd then reference the cmd in the task:
```xml
<Task commandLine="StartupScripts\script.cmd" executionContext="elevated" taskType="simple" />
```
Nothing rocket science so far! So why didn't the script work? I could jump on the box, run the cmd manually, and it would be fine.
How to debug the process?
I found the most useful way was to add markers from the cmd and the ps1. The cmd file looked like:

```
echo "Running" > "c:\test\log.txt"
powershell -command "Set-ExecutionPolicy Unrestricted" 2>> "c:\test\log.txt"
powershell .\startupScripts\script.ps1 %~dp0 2>> "c:\test\log.txt"
EXIT /B 0
```
Note: the .\startupScripts part of the ps1 path is very important!
Then the PowerShell:

```powershell
$tempFolder = "c:\test"
$logFile = "ps_log.txt"

# Work out which drive this deployment landed on (E: or F:)
$siterootFolder = "\sitesroot\0\"
If (!(Test-Path "e:\")) {
    $siterootFolder = "f:" + $siterootFolder
} Else {
    $siterootFolder = "e:" + $siterootFolder
}

function Write-Log {
    param( [string] $text )
    $path = [System.IO.Path]::Combine($tempFolder, $logFile)
    # Open with FileShare.Read so other processes can still read the log
    $fs = New-Object IO.FileStream($path, [System.IO.FileMode]::Append, [System.IO.FileAccess]::Write, [IO.FileShare]::Read)
    $sw = New-Object System.IO.StreamWriter($fs)
    $sw.WriteLine($text)
    $sw.Close()
}

Write-Log "Output folder is $siterootFolder"
```
Note: if you try to write to log.txt you will get process-locked exceptions, as the cmd holds a lock on that file.
There are all sorts of techniques for writing to a file; this example uses a StreamWriter. Hit up Google for different examples.