Running Microsoft SQL Server in Azure Virtual Machines – Options and Decision Points

How do you choose the best deployment options for your SQL Server environment running in Azure Virtual Machines? Here are some tips which should help those who have chosen this well-trodden route.

This post has been written to help with the key implementation decision points when considering Azure VMs to host SQL Server.

Why Run SQL Server In Azure Virtual Machines At All?

There are many reasons you may want to run SQL Server using VMs in Azure. Here are a handful of reasons many organisations choose this approach:-

  • You need some compute capacity which is not available on premises or is not available in a timely manner.
  • You need full control of the SQL Server instance or require custom add-on tools or applications running on the server.
  • You need to spin up a SQL Server instance for test / development purposes in a short timescale.
  • You need to spin up a production environment in a short timescale!
  • You want to pay as you use, and for what you use rather than invest up front in hardware and its implementation.
  • SQL Azure (PaaS) may not provide the control or features you require, such as Windows Authentication.
  • You may want to use Azure as a disaster recovery location for your on premises SQL Servers.
  • Depending on your current environment you may want a more reliable, secure and scalable platform to host your infrastructure.

SQL Server Azure Virtual Machine Deployment Options

Here are some options available for running SQL Server in virtual machines in Azure IaaS.



Azure Virtual Machine SQL Server Gallery Options

We can see below some of the SQL Server template options provided in the gallery as well as the MSDN option on the bottom left.



Choosing the Right Deployment Option

The first thing to do is review the SQL Server Edition requirements and related costs, followed by purpose and timescales.

SQL Server Edition and Licensing

A feature comparison of the different Editions of SQL Server can be found here :-

SQL Server 2014
SQL Server 2012

For example, if you need the advanced BI features, OR AlwaysOn high availability, OR SSRS scale-out farm capability, you will need Enterprise Edition. Also note that SQL Server 2014 Standard Edition is limited to 128GB of RAM, so if you need more than this, look at Enterprise. These are just some feature examples.

Look at the number of users that will need a license and calculate the cost of CALs if using Standard or BI Edition.

Duration of Usage for the SQL Server VM/s

Long Term Production Use

If the environment is to be for production and long-term use, review the costs of bringing your own license (check with your license reseller) compared to using the pre-built templates, using the Azure pricing calculator which can be found here :-

Short Term Use

If the environment is to be used for the short term, OR if there is a preference for pay-as-you-go, then you may choose to use the template virtual machines, which include the SQL Server license and are quick to provision. You may also choose this option if you prefer not to install SQL Server yourself and would rather have Azure provide the SQL licensing, so all costs are bundled into one bill from a single provider.

Configuration Options

OK, so by now hopefully you know which license model, SQL Server Edition and gallery image (or VHD upload) you require. You now need to ensure you have the correct configuration for the VM and its supporting infrastructure, such as storage and networking. I’ll provide high-level details here only.


Networking

This includes things like the type of network, IP addressing, subnets, IP address assignment, DNS, security options, load balancing, geographic location etc. You will also need to consider VPN types and options if required. Full Azure network documentation is provided here for reference –



Storage

Where will the storage be located geographically? Consider storage account names, local or geo-replicated redundancy, and container structure. Microsoft recommends locally redundant storage rather than geo-replicated storage for SQL Server for best performance.


High Availability

Does your VM need to be highly available? If so, you will need at least two VMs in an availability set. You will also need to architect database high availability using either Database Mirroring or AlwaysOn Availability Groups. Failover Clustering for SQL Server is not yet supported in Azure.

An example of an availability set is shown below, where essentially the SQL Server VMs run on different racks (fault domains) and in different update domains. This is what allows Microsoft to provide a 99.95% uptime SLA on machines in an availability set.

VM Sizing

If you need a high performance virtual machine then use the D Series (or even G Series) virtual machines in order to make use of the higher spec processors and SSD storage, as well as larger VM CPU and RAM options.

Tip: When creating your SQL Server use a D Series VM size to start with and then drop back to an A series after creation if you need to.

This allows you to easily switch between D and A Series sizes without having to recreate the VM. By recreate, I mean removing the virtual machine and recreating it using the existing disks; I don’t mean rebuilding the entire server.
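As a sketch using the classic (Service Management) Azure PowerShell cmdlets of the time (the cloud service and VM names below are hypothetical), the resize is a short pipeline:

```powershell
# Resize an existing VM; it restarts but keeps its disks and configuration.
# "Large" is the classic API name for the Standard A3 size.
Get-AzureVM -ServiceName "mycloudservice" -Name "sqlvm01" |
    Set-AzureVMSize -InstanceSize "Large" |
    Update-AzureVM
```

The same pipeline with an `-InstanceSize` such as "Basic_A3" switches between the Standard and Basic tiers.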

Here is a table of the D Series VMs as an example. You can see the amount of (temporary!) SSD storage available with each size in the last column, “Disk Sizes”. Taken from –



Follow the performance guidelines here which I will also cover in the near future with scripts and screenshots of the elements mentioned within it :-


Standard Versus Basic VMs

Azure Virtual Machines can be either Standard or Basic, and the configuration can be changed at any time through the GUI or PowerShell (although the machine will shut down and restart when changed).

Basic VMs are about 20% lower in cost but do not offer autoscaling or load balancing, and have a more limited range of lower-spec VM sizes compared to Standard. The other main difference is that Basic VM OS disks are limited to 300 IOPS, whereas Standard VM disks are 500 IOPS. Basic VMs are perfect for small servers that do not require these features or disk performance.

  • Here is the cost of an A3 Standard VM per month using PAYG prices as of 4th March 2015. The screenshots also show the available VM sizes with each option (Standard or Basic).


  • Here is a Basic VM cost for the same size server.




As you can see, there are many factors and decision points to consider even after you have decided to use Azure Virtual Machines for SQL Server. I hope this post helps provide a structured method and some useful information for those about to carry out an implementation of SQL Server in Azure. Put a comment below with your deployment options and reasoning, as it would be interesting to see other people’s decisions. Also, if you feel I have missed anything that I can add to help others, do let me know and I can update the post, with full credit of course!



The next post will look at performance guidelines for SQL Server running in Azure Virtual Machines to make sure your SQL Servers are optimally configured. Until then, farewell!


Installing SQL Server 2014 (RTM) on Windows Server 2012 R2

I am writing this as an update to my previous SQL Server 2012 blog and as a reference for installing SQL Server 2014 on Windows Server 2012 R2. This post focuses on a manual GUI installation with some scripts so you can see the steps. I’ll do an unattended one in the future.

The environment is made up from Hyper-V virtual machines and this is a standalone installation with local service accounts which can be changed to domain accounts (recommended for production installations).

Preparation Tasks

Install the .Net Framework components and open up the SQL Server Firewall ports.

From an elevated command prompt, run the following script to open the default SQL Server database engine and SSAS firewall ports.

netsh advfirewall firewall add rule name="Open SQL Server Port 1433" dir=in action=allow protocol=TCP localport=1433

netsh advfirewall firewall add rule name="SQL Server Analysis Services inbound on TCP 2383" dir=in action=allow protocol=TCP localport=2383 profile=domain

Run the following to install the .NET Framework components. Ensure the sources\sxs folder exists; it may not. Copy it from the Windows Server 2012 R2 source media or use a network location.

Install-WindowsFeature -Name NET-Framework-Features -Source C:\Sources\sxs

Reboot the server. It’s a good idea to run Windows Update after this step to update the .NET Framework components.


Once the server has been rebooted and Windows updates have been applied run setup.exe from the SQL Server 2014 media and select Installation from the left of the screen.


Select New Installation from the top right.


Select Next on the license screen; I have selected Evaluation.


Accept the license agreement.


A new option appears here to use Microsoft updates if you would like to. I am selecting no as I would patch SQL Server manually at the right time, after testing.


Run the product updates check which will look for updates to the setup files.  I am not running it here as I am installing it on the day it came out !  I would recommend you run this for production installs.

Run the setup rules checks.  I have a warning to tell me that the firewall is on and to make the necessary changes which we have already done above.   The .NET check will also give a warning if the computer does not have internet access.  It is safe to continue.


Select Features. Select the top option for a base installation of the SQL Server database engine.


I am adding more features than needed here, as I am using this installation GUI to generate my configuration file, which will populate all the options I may need going forward. I can then remove the features I don’t need in future installations. Select the options you require.



Note I have also used the C drive here. You can change this to install on a different drive, which is recommended where possible.

I am using the default SQL Server instance settings as I do not want to create a named instance.


I am also leaving the service accounts as local accounts as I can change them after the installation or in the scripts where necessary.


Check the collation is correct for your region, the screen below is fine for the UK which shows that SQL will be case insensitive and accent sensitive using code page 1252.   Ensure your OS region settings are correct too.


Database Engine Configuration

Here we can add administrative accounts, set authentication methods and data file locations etc.  I have added the current user who is the local administrator.  Add domain groups that will administer SQL Server here.

I have left the default Windows authentication setting here as this should be used whenever possible.


I have specified non default directories for the installation of data files.  This is to provide an example of how you might set up a production environment which has separate volumes for different data files.


I have reverted all paths back to C:\ after the screenshot. You can also enable FILESTREAM here if you wish, or it can be enabled later using the SQL Server Configuration Manager tool.
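If you do enable FILESTREAM later, the instance-level setting can be sketched in T-SQL (note that full file I/O access also requires enabling FILESTREAM on the service via SQL Server Configuration Manager):

```sql
-- 0 = disabled, 1 = T-SQL access only, 2 = T-SQL and Win32 file I/O access
EXEC sp_configure 'filestream access level', 2;
RECONFIGURE;
```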

Analysis Services Configuration

Specify the SSAS user, cube type and data file locations. I have left Multidimensional (the default). This is an option that cannot be changed after installation; however, you can add a Tabular instance as well, or remove the Multidimensional instance and then install Tabular.


Data files.


Review summary and configuration file path.


I will now take this configuration file from this server to be used in future scripted installations.


Complete the installation by clicking Install in the Setup wizard.

Wait for installation to complete.


Successful installation reported after about 6 minutes on my VM running on my laptop.


Now check your start screen and you won’t see any SQL icons.  Really useful I know.


Type SSMS and the icon will appear; right-click it and pin it to Start.


Once it’s on the start screen you can pin it to the task bar too if you like.

Type SQL Server to add other tools such as the Installation Center, Data Services, SQL Profiler, Config Manager etc. If you can’t find a tool by typing SQL Server, try part of its name, e.g. Config for Configuration Manager. Here are some of the icons I have added to my start screen.


Click on SQL Server Management Studio and you should see a fully working SQL Server 2014 environment. My server name below is Server-2012, which relates to the server OS.
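A few quick sanity checks can be run at this point; a minimal T-SQL sketch:

```sql
-- Confirm build, edition, patch level and server collation after installation
SELECT @@VERSION AS VersionInfo;
SELECT SERVERPROPERTY('Edition') AS Edition,
       SERVERPROPERTY('ProductLevel') AS ProductLevel,   -- RTM, SPn etc.
       SERVERPROPERTY('Collation') AS ServerCollation;   -- e.g. Latin1_General_CI_AS
```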


I hope this has been helpful to give you some guidance to install SQL Server 2014.

How to Backup SQL Server Directly to Azure Storage

From SQL Server 2012 SP1 CU2 onwards we have the ability to back up directly to Windows Azure Storage. So what are the main benefits of doing this, and why bother?

Here are some reasons why it is an important feature, and one that should be considered whether your SQL Server is hosted in Windows Azure or anywhere else.

  • Out of the box functionality to store your SQL Data offsite for DR Purposes with no additional software or agents required.
  • Very simple to get up and running.
  • A good low-cost strategy for offsite data backups for DR. Data is replicated in triplicate for high availability within the data centre in which the storage account resides. Storage accounts can also be geo-replicated across data centre regions.
  • Microsoft provide a guaranteed 99.9% SLA on access to the data to ensure it is available when you need it.
  • Only pay for what is used.
  • Leverage the vast data storage capacity of Windows Azure Storage.

Create a Storage account for SQL Server to backup to


I have chosen West Europe here as North Europe is my primary data centre, where my production SQL Servers reside. I have selected locally redundant storage, which means the data in this storage account will remain in this data centre and will be protected in triplicate by Microsoft.

Once the account is provisioned create a container for the backups.
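The portal steps above can also be scripted. Here is a sketch using the classic (Service Management) Azure PowerShell cmdlets of the time, with the account and container names from this example (mitsqlbackups / mysqlbackups):

```powershell
# Create a locally redundant (LRS) storage account - classic/Service Management model
New-AzureStorageAccount -StorageAccountName "mitsqlbackups" -Location "West Europe" -Type "Standard_LRS"

# Build a storage context from the account key, then create a private container
$key = (Get-AzureStorageKey -StorageAccountName "mitsqlbackups").Primary
$ctx = New-AzureStorageContext -StorageAccountName "mitsqlbackups" -StorageAccountKey $key
New-AzureStorageContainer -Name "mysqlbackups" -Permission Off -Context $ctx
```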



Container is created here.


Click Dashboard and then the Manage Access Keys button at the bottom of the screen.



Click the clipboard icon (I used the secondary access key).


Once copied to the clipboard, open a new query window in SQL Server Management Studio.


In the new query window, create a new credential and run the backup.

Here we see I have pasted the key copied from Azure into my SQL query, entered the name of my storage account “mitsqlbackups”, then specified the database to be backed up and the full path of the backup file, which includes the container we created earlier (mysqlbackups). I have also enabled compression here to help reduce the size of the backup file at rest.


Code pasted here, feel free to add your own comments.  I have highlighted in red the text that you would need to change for your environment.

DROP CREDENTIAL mycredential

IF NOT EXISTS (SELECT * FROM sys.credentials
WHERE credential_identity = 'mycredential')
CREATE CREDENTIAL mycredential WITH IDENTITY = 'mitsqlbackups'
,SECRET = 'wORKivaVUFo8HxJ…..copythiskeyfrom azure, storage account dashboard';

BACKUP DATABASE master
TO URL = 'https://mitsqlbackups.blob.core.windows.net/mysqlbackups/master.bak' -- example file name in the mysqlbackups container
WITH CREDENTIAL = 'mycredential', COMPRESSION;

We can see in the Azure Portal that the backup has indeed been created successfully.


The backup-to-Azure feature is only supported through script at this time and not through the GUI. Maybe this will change in SQL Server 2014; I’ll be checking when it RTMs next week. You can script your SQL backups using the credential that has already been created. For example, running the following code will run the backup from script without needing the Azure storage access key, which is now stored and encrypted within SQL Server.

BACKUP DATABASE master
TO URL = 'https://mitsqlbackups.blob.core.windows.net/mysqlbackups/master2.bak' -- example file name
WITH CREDENTIAL = 'mycredential', COMPRESSION;

A second backup has now been created from the script above.



The above tasks can be done via T-SQL, PowerShell or C#. Please see the documentation for full syntax and examples of these options, as well as different backup options such as differential and log backups etc.
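As a sketch of those other backup types (the database and file names below are illustrative, reusing the credential, account and container from the earlier example):

```sql
-- Differential backup (requires an earlier full backup of the database)
BACKUP DATABASE MyDatabase
TO URL = 'https://mitsqlbackups.blob.core.windows.net/mysqlbackups/MyDatabase_diff.bak'
WITH CREDENTIAL = 'mycredential', DIFFERENTIAL, COMPRESSION;

-- Transaction log backup (database must be in FULL or BULK_LOGGED recovery)
BACKUP LOG MyDatabase
TO URL = 'https://mitsqlbackups.blob.core.windows.net/mysqlbackups/MyDatabase_log.trn'
WITH CREDENTIAL = 'mycredential', COMPRESSION;
```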


Restore a Database Directly from Azure Storage

Here I have restored the database we backed up earlier. Yes, master was a bad example! Smile. So I have restored it with a different name and in a different file location as an example.

RESTORE DATABASE Master2
FROM URL = 'https://mitsqlbackups.blob.core.windows.net/mysqlbackups/master.bak' -- example file name
WITH CREDENTIAL = 'mycredential'
,MOVE 'master' TO 'F:\Program Files\Microsoft SQL Server\MSSQL11.MSSQLSERVER\MSSQL\DATA\Master2.mdf'
,MOVE 'mastlog' TO 'F:\Program Files\Microsoft SQL Server\MSSQL11.MSSQLSERVER\MSSQL\DATA\Master2.ldf'
,STATS = 5


You could also click the download button of course if you wanted to download the backup file quickly and manually.



Well, I hope someone finds this useful and that it shows it is in fact very easy to point SQL Server backups at Azure storage, which is highly available and can be used for DR purposes without additional software or hardware.

Cost Illustration

Using the Azure cost calculator (March 2014), we can see that storing 1 terabyte of locally redundant data in Azure costs £25.28 per month with a 12-month commitment plan; the pay-as-you-go price is £32.62. A Windows Azure Enterprise Agreement should reduce the £25.28 further. Check with your license reseller for exact pricing; these prices are given as an illustration from the guide prices provided on the Microsoft website using the full calculator available here,



Some Best Practice Guidance for using SQL Backup to Azure

  • Use backup compression to reduce cross-data-centre data costs and speed up backup times.
  • Use a unique file name for every backup to reduce the risk of accidentally overwriting backups.
  • Set the access level on the storage container to private so only authorised access is granted!
  • If the SQL VM is already in Azure, use the same region as the storage account (unless you need to back up to a different data centre) to avoid data transfer charges and for the best backup and restore performance.
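The unique-file-name tip above can be scripted by building a timestamp into the backup URL; a sketch, reusing the account, container and credential from the earlier example:

```sql
-- Build a unique URL such as .../master_20150304_101112.bak from the current date and time
DECLARE @url nvarchar(400) =
    'https://mitsqlbackups.blob.core.windows.net/mysqlbackups/master_'
    + REPLACE(REPLACE(REPLACE(CONVERT(varchar(19), GETDATE(), 120), '-', ''), ':', ''), ' ', '_')
    + '.bak';

BACKUP DATABASE master
TO URL = @url
WITH CREDENTIAL = 'mycredential', COMPRESSION;
```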


Error Message 3271 Returned “A nonrecoverable I/O Error occurred on file HTTPS://…..

Ensure the backup credential Identity is the same as the storage account name.

Syntax is here:-

IF NOT EXISTS (SELECT * FROM sys.credentials WHERE credential_identity = 'mycredential') CREATE CREDENTIAL mycredential WITH IDENTITY = 'mystorageaccount' ,SECRET = '<storage access key>' ;

Here, mystorageaccount should be mitsqlbackups, as that is the name of my storage account.




SQL Server Backup and Restore with Windows Azure Blob Storage Service

More Best Practices and Troubleshooting

Do we need IT Service High Availability and Disaster Recovery ?

Hello, today I am writing a short article on the huge subject of high availability (HA) and disaster recovery (DR), and why you should consider them to provide highly available IT services and minimise downtime for your users.

I’ll cover brief definitions and examples of High Availability (HA) and Disaster Recovery (DR) planning, and explain their differences.  I feel this is important as many people often refer to HA and DR interchangeably and sometimes in the wrong context.

I talk to many clients and help architect highly available systems, mostly based on Microsoft technologies such as Windows, SharePoint and SQL Server, and many common questions and discussions come up regarding HA and DR. I thought it would be good to capture this in a high-level article to help others, and for me to reference when discussing these topics.

What is High Availability (HA) ?

In the world of IT infrastructure, a service that is highly available is considered to be resilient to component failure at the site level, where a site would be a data centre. Resilience here means that the service would have little or no downtime if any one component in the infrastructure hardware or software stack failed. A component could be anything from a power supply in a server, a motherboard, a hard disk or a network switch at the hardware level. This then bubbles up to the software level in the guise of the operating system, and then up to the user application and/or any other software that may be present, such as drivers or backup software.

The availability of any service can be made resilient in a number of ways, at both the hardware and software levels.

Here are some examples :-

Component Level Resiliency


The above table gives a rough idea of how we can architect some level of high availability. The application row is very high level, as application resiliency depends on how the application is written and whether it is cluster-aware, for example; some applications cannot be added to a cluster effectively.

An example for SharePoint would be to use two or more servers in what is termed a “farm” and use network load balancing (NLB) to distribute network traffic to the most responsive server. For Microsoft SQL Server we could create a cluster, as SQL Server is cluster-aware; well, the database engine, Analysis Services and SSIS (to a degree) are. For Reporting Services we would use the farm scenario, as in our SharePoint example, with NLB.

So now we have a highly available service in our data centre which is running on highly available hardware with multiple servers being fed from multiple power sources and internet links.  In effect we are protected against any single component failure and we can automatically failover to another physical host, virtual server, power supply, or internet feed for example.  We may want to control this so failover could be manual for some or all components.  This is where monitoring comes in but that’s a subject for another day !

A Highly Available Virtual Host Cluster

In order to provide our virtual machines (VMs) some high availability and protection from physical hardware and potentially power and network link outages, we could create a cluster. The principle of a virtual host cluster is the same in Hyper-V as it is for VMware, in that the hosts can support multiple VMs which can automatically move (fail over) to another host if required.

Here we see a highly available three node Hyper-V cluster which can host a number of virtual machines.


The blue machines represent general VMs and the red one represents a mission critical virtual machine that must be kept alive at all times.

The cluster is designed to be highly available in that there are multiple hosts with multiple connections to the various networks required such as heartbeat, Live Migration, iSCSI SAN for storage and the LAN.  Any network switch, network card or port can fail and the system will continue to run.  A host could also fail and any VMs which are marked for high availability would be automatically moved to another host server without any downtime (host capacity allowing).  Assume the iSCSI SAN storage is provided by a highly available storage system with RAID disk configurations to help protect against disk failure.
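As an illustration (the VM name is hypothetical), marking an existing VM as highly available on a Windows failover cluster can be sketched with the FailoverClusters PowerShell module:

```powershell
# Import the clustering cmdlets (installed with the Failover Clustering feature)
Import-Module FailoverClusters

# Make an existing Hyper-V VM a clustered (highly available) role
# so it can fail over to another host; "SQLVM1" is an example VM name
Add-ClusterVirtualMachineRole -VMName "SQLVM1"
```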

So we are fully protected against failure of any physical server, network and storage components.

So why do I need to use clustering or NLB when my environment and VMs are already highly available?

The answer to this common question is that there are several good reasons to protect your service further and beyond the physical level mentioned just above.  The short answer will come if you think of a blue screen, OS freeze or maintenance.

These reasons are listed here with a bit more detail:-

Application Level High Availability through NLB or Clustering


So are we done ? Our application or service is now highly available and nothing can possibly take down our application…. or can it ?

Well, if you have a highly secure, solid and resilient data centre where power, cooling, access etc. are all good, then almost all is well and the chances of service outages are reduced significantly. However, as we live in the real world, things CAN go wrong at the data centre level. This leads us to disaster recovery.

Disaster Recovery

Disaster recovery is required to protect services from a disaster at the data centre level. The process and planning of bringing services back online in the case of a disaster is referred to as business continuity. As in: if your business depends on certain services, will those services be available in the event of a disaster so that the business can continue to function? Other factors to consider in business continuity are things like: how will users be notified? If the main offices are no longer accessible, where will people work from? And so on. The most likely cases requiring invocation of a disaster recovery plan are natural disasters such as floods and fires, or even a malicious attack or a UFO, meteorite or plane crash into the data centre. Some of these things are more likely to happen than others! Smile

In the event of a disaster, services would resume from the DR site, ideally without much or any disruption to the users of the services. There may well be manual system administration tasks to complete in a DR scenario, such as IP address configuration, SAN storage and cluster configuration, to bring up services. As technology evolves these manual steps get simpler, and the effort required AND the recovery time are reduced. An example of this is Microsoft’s Hyper-V Replica, which is designed for a DR scenario: it allows the administrator to configure a separate IP address for the DR site, where previously this may have been done using additional scripts or manual steps.

Other reasons for having DR are to gain confidence from your clients and to prove to auditing bodies that your organisation can continue in the event of a disaster. The benefit of having a GOOD DR process and architecture is that DR can be tested properly with minimal or no disruption (out of hours, please!) to consumers of a service.



Having a good HA and DR process in place for your mission-critical systems provides your business with the confidence to perform the value-adding activities required for the organisation to be successful. The outcome may be better customer satisfaction, more profit or even saving lives in some cases. HA and especially DR require proper planning and investment, and some example questions to ask when considering an HA and DR strategy are “Why do we need it?”, “What does it cost us if there is an outage?”, “Whose responsibility is it if x service is down for x amount of time?”, and “How long can we function without the service?”.

Most importantly, I think it is essential to ask “What would happen if we did nothing?”. For example, would one day of downtime lose you more money than implementing a good HA and DR strategy would cost? Would you lose your trading license or lose customers?

Well it is a very broad topic and I hope I have highlighted some key or interesting points for you to take away.  Please do comment on the post if you found this useful or if there are other important points which could help others.  And thanks for reading. I know it’s a long article this time…

STOP ! Read this SQL Server Pre Deployment Checklist Before….

There are many things to check BEFORE implementing an Enterprise or even a medium-scale SQL Server based infrastructure. Below are some key points to consider before deploying SQL Server in a production environment, to help prevent a whole lot of pain once a system has been deployed and is in production. This post relates to general pre-installation checks rather than the logical and physical architectural decision-making process for the implementation of Microsoft SQL Server 2005 through to SQL Server 2012 SP1+.

Hopefully this article will help some people avoid learning the hard way!

There are many reasons for following a checklist and ensuring the environment is correct; examples are recording a snapshot of the current configuration and benchmarking performance. The most important reason to use a checklist and get things right first time is that it is much simpler to get things right pre-build than once the system has gone into production and WHEN problems arise. Note the missing IF statement here.


Pre-Installation Checks

Server Level

Firmware

Ensure BIOS, RAID controller and HBA firmware is up to date.

Remove any unnecessary software

If the server is not a fresh build, remove any unnecessary software such as old and unused SQL Server versions or components, and any agents that are no longer required or need updating.


Operating System and Server Environment

Ensure environmental aspects are considered such as AD, DNS, server configuration and health etc.
Windows Server Edition

Ensure you have the correct Edition for current and future requirements, e.g. note the memory limitation in Windows Server Standard Edition (32GB). If you have a license to use Enterprise, then that may be a better option to start with. This is only relevant to Windows Server 2008 and 2008 R2; in Windows Server 2012 there are NO feature differences between Standard and Datacenter, only virtualisation rights, and there is no Windows Server 2012 Enterprise Edition.

Windows patch level

Ensure the server is up to date on patching and that an update strategy is defined and implemented.

Hardware Drivers

Ensure all drivers are up to date especially storage, network and HBA drivers.

Windows OS Regional Settings

Ensure the region settings of the OS are correct for the country of use. I have seen many installations in the UK, for example, set with the default region as US. Once SQL Server is installed with defaults, the collation will be incorrect for the region it is being used in, resulting in incorrect date/time handling etc.

Consider using DNS A records for the server name

Consider the use of a DNS A record for application access to the SQL Server. This way, if a database, set of databases or a whole instance needs to be moved to a new server, it can be done with ease by changing the DNS record in AD, without making changes to applications and reports. A good example of this is Reporting Services or SharePoint. Yes, you can use SQL aliasing too, but it may require a bit more configuration; e.g. for SharePoint you would have to create an alias on each SharePoint server, whereas with a DNS alias you change the record in one place (DNS!).
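On a Windows DNS server this can be sketched with the DnsServer PowerShell module (the zone name, record name and IP address below are examples):

```powershell
# Create an A record "sqlprod" in the contoso.local zone pointing at the SQL Server.
# Applications then connect to sqlprod.contoso.local instead of the machine name,
# so the database platform can be moved later by updating this one record.
Add-DnsServerResourceRecordA -Name "sqlprod" -ZoneName "contoso.local" -IPv4Address "10.0.0.25"
```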

Check Group Policies

Check and audit which group policies are being applied to the server to ensure there are no unexpected or unwanted GPOs being applied and vice versa.

Check event logs for pre-existing errors or warnings

Ensure event logs are clean. Take special note of the system log, which can highlight potential system and hardware errors or time bombs.
Network Configuration and bandwidth

Is your networking optimally configured? This includes iSCSI or Fibre Channel links to any SAN that might be in the mix. Test and benchmark throughput using tools such as CrystalDiskMark or SQLIO. Are you getting the expected throughput, i.e. are you achieving the maximum sequential write throughput your links should deliver (for example with 2 x 4Gb HBA cards)? Ensure drivers and multi-pathing configuration and software are installed and correct. Note some SANs may not allow an Active/Active configuration, or may offer it as an additional cost option. If all is OK, move on to disk storage configuration.

SQL Service Account Rights

Grant the SQL Server service account the “Perform Volume Maintenance Tasks” right to allow instant file initialisation. More info here

Also grant the SQL Server service account the “Lock Pages In Memory” right, but only if required. More on these options will be covered in later blogs. There are many blogs about Lock Pages in Memory and whether it should be used or not, for example by Jonathan Kehayias and from the Windows Server Performance Team. A lot depends on the environment; it is not generally recommended in virtual environments where there is a risk of a host requiring memory to start up VMs in an outage scenario, for example.

SQL Service Account Permissions

The SQL Server service account does not require any special permissions. Any permissions required are granted during installation or when changing the account using SQL Server Configuration Manager. This is the main reason why you should use this tool for changing accounts and NOT the Windows Services console.


Optimal storage configuration is essential for SQL Server to perform well, especially on larger databases.

Disk Configuration

Ensure your disks are formatted with a 64K allocation unit size.  The default of 4K will provide poor performance: SQL Server reads and writes data in 64K extents (eight 8K pages), which fit perfectly into a 64K cluster size (file allocation unit size), whereas with 4K clusters each extent is split across many clusters.  This can and does cause a lot of fragmentation in the database. This simple configuration step can improve IO performance by up to 30%.  Get it right up front: if the formatting is wrong, fixing it once in production means moving all the data off and reformatting the disks.  Disk alignment is no longer a requirement unless you are still using Windows 2003.  If you are, an upgrade is WELL overdue as mainstream support has ended, and Windows Server 2003 cannot take advantage of modern hardware (i.e. the major processor advancements since 2003).  Further info here.
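The current allocation unit size can be checked, and a new data volume formatted, along these lines (the drive letter and label are assumptions, and formatting is destructive, so this is for new volumes only):

```shell
:: "Bytes Per Cluster" in the output should read 65536 on a 64K-formatted volume.
fsutil fsinfo ntfsinfo F:

:: Format a new SQL data volume with a 64K allocation unit size.
format F: /FS:NTFS /A:64K /V:SQLData /Q
```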

RAID Configuration

Are your RAID arrays optimised for SQL Server and the expected usage? There are many articles explaining why RAID group choices are important. More info can be found here: RAID Levels and SQL Server.
Of course, modern SANs coupled with tiered storage make traditional RAID groups pretty much redundant by using storage pools: essentially a large pool of disks from which LUNs (Logical Unit Numbers) are carved as logical disks. Correct setup and configuration can provide better performance and use less disk space than traditional RAID by removing storage silos.  See here for a great blog by Paul Galjan: Disk and File Layout for SQL Server.

Disk Sizing

Are your disks sized appropriately to accommodate at least medium-term growth expectations? Do you have the capacity to easily add more storage if required?

SAN Configuration and load

Ensure the SAN is not already saturated or at capacity, and that it can deliver the IOPS your application requires to perform well.  Refer to Microsoft Certified Master and MVP Brent Ozar, who provides tonnes of excellent SQL Server information in his blogs.

SAN Software and Vendor Recommendations

Check the vendor's recommendations for SQL Server, as they may differ from the standard best practices for a particular SAN.

SAN – HBA Drivers and Software

Ensure the correct and latest drivers for any HBA cards are installed and configured correctly, e.g. that multi-pathing is set up and working.  Some vendors do not allow an Active/Active (maximum bandwidth) configuration by default; check with the vendor.

SQL Server Specific

SQL Server Version and Edition

Do you have the correctly licensed SQL Server edition? Per core or per VM? Standard, BI or Enterprise Edition? Ensure the latest service pack is applied and the licence key is correct for the use case.

SQL Server Service Accounts

Ensure the service accounts are correctly planned and created in Active Directory prior to the installation, e.g. separate accounts for Reporting Services and Analysis Services.  Ensure you have the account details to hand.
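A minimal sketch of pre-creating the accounts with the ActiveDirectory PowerShell module (the account names, OU and domain are assumptions to adapt to your environment):

```powershell
Import-Module ActiveDirectory

# One account per service keeps permissions separate and auditable.
foreach ($name in 'sqlsvc', 'sqlagtsvc', 'ssassvc', 'ssrssvc') {
    New-ADUser -Name $name -SamAccountName $name `
        -Path 'OU=Service Accounts,DC=demo,DC=local' `
        -AccountPassword (Read-Host -AsSecureString "Password for $name") `
        -PasswordNeverExpires $true -Enabled $true
}
```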

SQL Cluster Considerations

If using a Windows cluster, ensure that the storage and resource groups are assigned correctly, and that the quorum method and mode are designed correctly, e.g. disk or file share witness, node majority options. See the following: Configure and Manage the Quorum in Windows 2012 Clustering, and Understanding Quorum Configurations in a Failover Cluster.
Consider Cluster-Aware Updating for patching clusters, especially clusters with many nodes. This feature is only available from Windows Server 2012 onwards.
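As a sketch, the quorum configuration can be reviewed and changed with the FailoverClusters PowerShell module (the witness share path is an assumption):

```powershell
Import-Module FailoverClusters

# Report the current quorum mode and witness resource.
Get-ClusterQuorum

# Example: node majority plus a file share witness, useful for an even node count.
Set-ClusterQuorum -NodeAndFileShareMajority '\\fileserver\ClusterWitness'
```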

Most importantly, ensure you thoroughly test and document the failover process.  You do not want to be looking for documentation or ringing support (panic... Microsoft or the storage vendor?!) in the event of a cluster failure during a production outage.  The whole reason a SQL cluster exists is to minimise downtime of critical systems, and you will not be popular if you cannot show evidence of testing failover and failback.

Thank you for reading, I will try and follow this up with a post on what to check AFTER the SQL server installation is complete.

Also check out my post on how to complete a scripted install of SQL Server on Windows Server 2012 Core, which will of course also work with the full GUI shell of Windows Server 2012.


Check out the newly released SQL Server community badges above, they are just for fun !

I hope this article is useful to at least a handful of people and spares someone from possible future pain.

Did you find this useful ? Do you have further good checklist tips ? Post a reply to share…

Installing SQL Server 2012 on Windows Server 2012 Core

This post accompanies a video tutorial which is available here


I would like to document and share how to install SQL Server 2012 on a Core installation of Windows Server 2012. Windows Server Core, as you may be aware, is a version of Windows Server without the graphical tools and the weight of a full Windows Server installation. Some of the reasons for using Server Core are listed here:-

  • Smaller attack surface, as less is installed and running by default.
  • Less patching and fewer reboots, as many services and features such as Internet Explorer, Windows Explorer and the Start screen are not installed.
  • Smaller disk footprint: the OS takes approximately 4GB as opposed to around 8GB for the full install.

New in Windows Server 2012 is the ability to install the full GUI if required and then remove it again.  This would usually only be required to install certain drivers or applications that must have a GUI.
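For example (both operations trigger a reboot with the -Restart switch):

```powershell
# Add the full GUI to a Server Core installation, then remove it again afterwards.
Install-WindowsFeature Server-Gui-Mgmt-Infra, Server-Gui-Shell -Restart
Uninstall-WindowsFeature Server-Gui-Mgmt-Infra, Server-Gui-Shell -Restart
```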

The installation in the demo video is done using a Windows Server 2012 Core VM which is then configured through sconfig and joined to a domain.  The server already has some installation files and scripts on the C drive as part of the SYSPREP image it was copied from.

The scripts used in the video are available in this article. SQL Server and Analysis Services are configured to run using Domain Accounts.

The Process

Prepare any code and scripts up front so they are ready for the installation.  Ensure the steps below are followed for a smooth installation.

1. Enable Remote Management and Remote Desktop; check and configure network IP settings.

2.  Join the server to the domain and create the required service accounts in Active Directory.

3.  Ensure the Sources\sxs folder from the Windows 2012 installation media is available or copied to the server. Mine is copied to C:\SourcesSxS

4.  Allow SQL Server through firewall script :-

netsh advfirewall firewall add rule name="Open SQL Server Port 1433" dir=in action=allow protocol=TCP localport=1433

5.  Allow Analysis Services through the firewall :-

netsh advfirewall firewall add rule name="SQL Server Analysis Services inbound on TCP 2383" dir=in action=allow protocol=TCP localport=2383 profile=domain

Here are the SQL scripts used in the demo, and the code for each is found below.


6. Create or download and customise the ConfigurationFile.ini for SQL Server.

This can be done by running through the full SQL Server setup GUI on another server, stopping before the installation step, and grabbing the .ini file from the default location of :- "C:\Program Files\Microsoft SQL Server\110\Setup Bootstrap\Log"

See this link for full details from Technet:-

You can then customise this file as you wish and select the components you require, installation type, service accounts and directory locations.  Note Reporting Services (SSRS) is not supported on Server Core; however, SSIS is.

Here is the ConfigurationFile.ini from my setup for you to compare yours against or just copy this one !

Add SSMS, ADV_SSMS to the Features line to add the management tools.

Code Snippet

;SQL Server 2012 Configuration File – Mitesh Chauhan 2012
[OPTIONS]

; Specifies a Setup work flow, like INSTALL, UNINSTALL, or UPGRADE. This is a required parameter.
ACTION="Install"

; Detailed help for command line argument ENU has not been defined yet.
ENU="True"

; Parameter that controls the user interface behavior. Valid values are Normal for the full UI, AutoAdvance for a simplied UI, and EnableUIOnServerCore for bypassing Server Core setup GUI block.
;UIMODE="Normal"

; Setup will not display any user interface.
QUIET="False"

; Setup will display progress only, without any user interaction.
QUIETSIMPLE="True"

; Specify whether SQL Server Setup should discover and include product updates. The valid values are True and False or 1 and 0. By default SQL Server Setup will include updates that are found.
UpdateEnabled="False"

; Specifies features to install, uninstall, or upgrade. The list of top-level features include SQL, AS, RS, IS, MDS, and Tools. The SQL feature will install the Database Engine, Replication, Full-Text, and Data Quality Services (DQS) server. The Tools feature will install Management Tools, Books online components, SQL Server Data Tools, and other shared components.

; Specify the location where SQL Server Setup will obtain product updates. The valid values are "MU" to search Microsoft Update, a valid folder path, a relative path such as .\MyUpdates or a UNC share. By default SQL Server Setup will search Microsoft Update or a Windows Update service through the Window Server Update Services.
UpdateSource="MU"

; Displays the command line parameters usage
HELP="False"

; Specifies that the detailed Setup log should be piped to the console.

; Specifies that Setup should install into WOW64. This command line argument is not supported on an IA64 or a 32-bit system.
X86="False"

; Specify the root installation directory for shared components.  This directory remains unchanged after shared components are already installed.
INSTALLSHAREDDIR="C:\Program Files\Microsoft SQL Server"

; Specify the root installation directory for the WOW64 shared components.  This directory remains unchanged after WOW64 shared components are already installed.
INSTALLSHAREDWOWDIR="C:\Program Files (x86)\Microsoft SQL Server"

; Specify a default or named instance. MSSQLSERVER is the default instance for non-Express editions and SQLExpress for Express editions. This parameter is required when installing the SQL Server Database Engine (SQL), Analysis Services (AS), or Reporting Services (RS).

; Specify the Instance ID for the SQL Server features you have specified. SQL Server directory structure, registry structure, and service names will incorporate the instance ID of the SQL Server instance.

; Specify that SQL Server feature usage data can be collected and sent to Microsoft. Specify 1 or True to enable and 0 or False to disable this feature.
SQMREPORTING="False"

; RSInputSettings_RSInstallMode_Description
RSINSTALLMODE="FilesOnlyMode"

; Specify if errors can be reported to Microsoft to improve future SQL Server releases. Specify 1 or True to enable and 0 or False to disable this feature.

; Specify the installation directory.
INSTANCEDIR="C:\Program Files\Microsoft SQL Server"

; Agent account name

; Auto-start service after installation.

; Startup type for Integration Services.
ISSVCSTARTUPTYPE="Automatic"

; Account for Integration Services: Domain\User or system account.
ISSVCACCOUNT="NT Service\MsDtsServer110"

; The name of the account that the Analysis Services service runs under.

; Controls the service startup type setting after the service has been created.
ASSVCSTARTUPTYPE="Automatic"

; The collation to be used by Analysis Services.
ASCOLLATION="Latin1_General_CI_AS"

; The location for the Analysis Services data files.
ASDATADIR="C:\Program Files\Microsoft SQL Server\MSAS11.MSSQLSERVER\OLAP\Data"

; The location for the Analysis Services log files.
ASLOGDIR="C:\Program Files\Microsoft SQL Server\MSAS11.MSSQLSERVER\OLAP\Log"

; The location for the Analysis Services backup files.
ASBACKUPDIR="C:\Program Files\Microsoft SQL Server\MSAS11.MSSQLSERVER\OLAP\Backup"

; The location for the Analysis Services temporary files.
ASTEMPDIR="C:\Program Files\Microsoft SQL Server\MSAS11.MSSQLSERVER\OLAP\Temp"

; The location for the Analysis Services configuration files.
ASCONFIGDIR="C:\Program Files\Microsoft SQL Server\MSAS11.MSSQLSERVER\OLAP\Config"

; Specifies whether or not the MSOLAP provider is allowed to run in process.

; Specifies the list of administrator accounts that need to be provisioned.
ASSYSADMINACCOUNTS="DEMO\administrator" "DEMO\ssassvc"

; Specifies the server mode of the Analysis Services instance. Valid values are MULTIDIMENSIONAL and TABULAR. The default value is MULTIDIMENSIONAL.

; CM brick TCP communication port

; How matrix will use private networks

; How inter brick communication will be protected

; TCP port used by the CM brick

; Startup type for the SQL Server service.
SQLSVCSTARTUPTYPE="Automatic"

; Level to enable FILESTREAM feature at (0, 1, 2 or 3).

; Set to "1" to enable RANU for SQL Server Express.
ENABLERANU="False"

; Specifies a Windows collation or an SQL collation to use for the Database Engine.
SQLCOLLATION="Latin1_General_CI_AS"

; Account for SQL Server service: Domain\User or system account.

; Windows account(s) to provision as SQL Server system administrators.
SQLSYSADMINACCOUNTS="DEMO\administrator" "DEMO\sqlsvc"

; Provision current user as a Database Engine system administrator for SQL Server 2012 Express.

; Specify 0 to disable or 1 to enable the TCP/IP protocol.
TCPENABLED="1"

; Specify 0 to disable or 1 to enable the Named Pipes protocol.
NPENABLED="0"

; Startup type for Browser Service.

; Specifies which account the report server NT service should execute under.  When omitted or when the value is empty string, the default built-in account for the current operating system.
; The username part of RSSVCACCOUNT is a maximum of 20 characters long and
; The domain part of RSSVCACCOUNT is a maximum of 254 characters long.
RSSVCACCOUNT="NT Service\ReportServer"

; Specifies how the startup mode of the report server NT service.  When
; Manual – Service startup is manual mode (default).
; Automatic – Service startup is automatic mode.
; Disabled – Service is disabled
RSSVCSTARTUPTYPE="Automatic"

7.  Run the .NET Framework 3.5 Feature Installation from PowerShell.

The following script sets the execution policy to RemoteSigned (allowing unsigned local scripts to run), then installs the .NET Framework 3.5 feature, telling Windows to find the source files in the C:\Sources\sxs folder.

Set-ExecutionPolicy RemoteSigned -Force

Install-WindowsFeature -Name NET-Framework-Features -Source C:\Sources\sxs

8.  Insert Media (if not already inserted or copied locally). Run the SQL installation command or batch file.

The text used in mine is here (carriage returns added for readability):

d:\setup.exe /SQLSVCPASSWORD=P@ssword123 /AGTSVCPASSWORD=P@ssword123 

/ASSVCPASSWORD=P@ssword123  /configurationfile=c:\Scripts\configurationFile.ini /IAcceptSQLServerLicenseTerms

9.  Wait for the install to complete and check the log files. 

The installation log files will be found in "C:\Program Files\Microsoft SQL Server\110\Setup Bootstrap\Log". Reboot (recommended) and test connectivity when setup has completed.

10.  Run the Mofcomp command or script to allow remote configuration via SQL Server Configuration Manager. The script is simply mofcomp sqlmgmproviderxpsp2up.mof, run from the "C:\Program Files (x86)\Microsoft SQL Server\110\Shared" directory.


Cannot access the SQL Server from a remote server.

First check networking with ping etc. Check IP Addressing, DNS, subnets and LAN/VLAN config.

Once the above is checked, the problem is most likely to do with the firewall settings on the Windows Core server. You can run the Core Configurator FirewallSettings.ps1 from the PowerShell command line. Download it from CodePlex (link below) and copy it to the virtual disk, either across the network or by shutting down the VM and double-clicking the VHDX file, which mounts the drive and allows you to copy files to and from the VHDX.  Remember to eject it when finished.

Download Core Configurator from here

The Firewall settings should look something like this.


This should enable remote connections correctly.

Alternatively turn off the firewall which can be done with the same tool, or Group Policy for example.

Cannot connect to Analysis Services

Again this is most likely to be a firewall issue. Run the following command to allow access to the Analysis Services default port.

netsh advfirewall firewall add rule name="SQL Server Analysis Services inbound on TCP 2383" dir=in action=allow protocol=TCP localport=2383 profile=domain

WMI Errors

When connecting remotely via Computer management you may get WMI errors.

See the following link

Cannot connect to SQL Server Configuration Manager Remotely

Error :- Cannot connect to WMI provider. You do not have permission or the server is unreachable. Note that you can only manage SQL Server 2005 servers with SQL Server Configuration Manager.
Access is denied.

The fix for this is to run the following command from the correct directory shown below.

Mofcomp sqlmgmproviderxpsp2up.mof



Cannot connect to SQL Server as dynamic ports are used for a named instance.

Add an exception for the SQLSERVR.EXE file in the C:\Program Files\Microsoft SQL Server\MSSQL11.<instance_name>\MSSQL\Binn directory. Also ensure the SQL Browser service ports are unblocked. To do this enter:-

netsh advfirewall firewall add rule name="SQL Server Browser Services inbound on TCP 2382" dir=in action=allow protocol=TCP localport=2382 profile=domain
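For the named instance itself, a program-based rule avoids chasing the dynamic port, and the Browser's UDP 1434 discovery port should also be open (the instance path below is an assumption; substitute your own instance name):

```shell
:: Program-based exception for the named instance's sqlservr.exe (copes with dynamic ports).
netsh advfirewall firewall add rule name="SQL Server (named instance)" dir=in action=allow program="C:\Program Files\Microsoft SQL Server\MSSQL11.INSTANCENAME\MSSQL\Binn\sqlservr.exe" enable=yes profile=domain

:: SQL Browser listens on UDP 1434 for named-instance discovery.
netsh advfirewall firewall add rule name="SQL Server Browser UDP 1434" dir=in action=allow protocol=UDP localport=1434 profile=domain
```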

For further SQL Server firewall-related configuration steps please see the following links:-

Configure the Windows Firewall to Allow SQL Server Access

Configure the Windows Firewall to Allow Analysis Services Access

Thanks for reading !


Please let me know if you found this article useful and if you use or plan to use Windows Core with SQL Server and if there are any interesting findings you would like to share.