
Move to Cloud – Quick Guide for SharePoint Migration

Author: Rajan Todankar | November 15, 2017

According to current market trends, 85% of enterprises had a multi-cloud strategy in 2016, and that number keeps growing. Among those on the cloud, 42% run on the public cloud while 68% operate their own private cloud (the figures overlap because multi-cloud enterprises often use both).

The cost of procuring and implementing a private cloud is high. Hence, many enterprises prefer a public cloud to save on infrastructure and implementation costs.

Apart from procuring the cloud infrastructure, enterprises have to incur the additional cost of a migration tool. There are many third-party migration tools on the market that allow you to migrate an on-premise SharePoint environment to the cloud. However, they charge for the migration based on the amount of data to be transferred and the number of users. If a large amount of data and a large number of users are involved, the cost of the migration tool can exceed the infrastructure cost.

In such cases, Microsoft provides an out-of-the-box method in which all the data can be moved to the cloud without involving any third-party tool, saving the organization this additional cost.

Along with the migration itself, Microsoft also provides a planning tool. Microsoft FastTrack helps you plan migration projects based on proven success stories. Once created, the plan can be reused for future migrations, helping you improve in precision and accuracy every time you perform a migration.

Moving to the cloud using Microsoft’s out-of-the-box method is very easy. We just need a Windows machine connected to the on-premise network and a little knowledge of Windows PowerShell and Microsoft Azure. Microsoft has not coined a name for this method, so we will call it “The PowerShell Way.”

The PowerShell Way – In Action


  1.    SharePoint Online Management Shell
  2.    Office 365 Subscription
  3.    Microsoft Azure Subscription
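
Before running any of the migration commands, connect the SharePoint Online Management Shell to your tenant. A minimal sketch, assuming a hypothetical tenant name “contoso” (replace it with your own):

```powershell
# Import the SharePoint Online Management Shell module (installed separately).
Import-Module Microsoft.Online.SharePoint.PowerShell -DisableNameChecking

# Connect to the tenant admin site; you will be prompted for credentials.
# "contoso" is a placeholder tenant name.
Connect-SPOService -Url "https://contoso-admin.sharepoint.com"
```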

There are multiple steps which need to be executed in the sequence given below to migrate the on-premise data to SharePoint Online.

    1.    Create a Migration Package:
  • a. Create a Migration Package from a file share:  Create the Migration Package using the SharePoint Online PowerShell command below:
New-SPOMigrationPackage -SourcePath \\fileserver\share\folder1 -OutputPackagePath d:\MigrationPackages\Folder1_SrcPkg -TargetWebUrl "https://<SPOSite>" -TargetDocumentLibraryPath "Shared Documents"


The parameters used for this command are as follows:

  • SourcePath – Path of the files which need to be migrated
  • OutputPackagePath – Path where the Migration Package output files will be saved
  • TargetWebUrl – URL of the SharePoint Online site (destination site)
  • TargetDocumentLibraryPath – Name of the document library on the SharePoint Online site

               b. Create a Migration Package from SharePoint on-premise content:  Create the Migration Package for SharePoint on-premise content using the Export-SPWeb cmdlet from the SharePoint Server Management Shell, as below:

Export-SPWeb -Identity "http://On-PremiseSPSite" -ItemUrl "/OnPremiseDocLib" -Path "C:\SPOnPremiseExport" -NoFileCompression -IncludeVersions All


The parameters used for this command are as follows:

  • Identity – URL of the on-premise site
  • ItemUrl – Relative URL of the document library
  • Path – Path where the output files will be saved
  • NoFileCompression – Instructs the cmdlet not to compress the output files
  • IncludeVersions – Instructs the cmdlet which document versions to include; "All" exports every version in the document library.
  2.    Convert Migration Package:  Re-map a previously created file-share-based package or exported SharePoint package to accurately describe objects in a target web, using the PowerShell command below:
ConvertTo-SPOMigrationTargetedPackage -SourceFilesPath \\fileserver\share\folder1 -SourcePackagePath d:\MigrationPackages\Folder1_SrcPkg -OutputPackagePath d:\MigrationPackages\Folder1_TgtPkg -TargetWebUrl "https://<SPOSite>" -TargetDocumentLibraryPath "Shared Documents"


The Parameters used for this command are as follows:

  • SourceFilesPath – Path of the source Migration Package files
  • SourcePackagePath – Path where the package’s source metadata files exist.
  • OutputPackagePath – Path where the transformed package metadata files will be saved. If the directory does not exist, then it will be created.
  • TargetWebURL – URL of the SharePoint Online site (Destination site)
  • TargetDocumentLibraryPath – Name of the document library on SharePoint Online site
  3.    Upload the Migration Package to Azure Blob Storage:  Create Azure containers, upload the migration package files into the appropriate containers, and snapshot the uploaded content.
  • Log in to the Azure Portal
  • Browse to Storage → Create a new Storage account
  • Create two containers in the new storage account, viz. migration-files and migration-package
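
If you prefer to script the container setup as well, the classic Azure PowerShell storage cmdlets (current at the time of writing) can create the two containers. A hedged sketch, assuming a hypothetical storage account name “o365mig” and that your account access key is at hand:

```powershell
# Build a storage context from the account name and key (both placeholders here).
$ctx = New-AzureStorageContext -StorageAccountName "o365mig" `
    -StorageAccountKey "REPLACE_WITH_YOUR_AZURE_STORAGE_ACCOUNT_ACCESS_KEY"

# Create the two containers the migration cmdlets expect.
New-AzureStorageContainer -Name "migration-files" -Context $ctx
New-AzureStorageContainer -Name "migration-package" -Context $ctx
```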

Use the PowerShell command below to upload the Migration Package:

$azurelocations = Set-SPOMigrationPackageAzureSource -SourceFilesPath "\\Server\sourcepathfilefolder\SPOTEMP" -SourcePackagePath "C:\MigrationPackages\SPOMigrationPackage" -FileContainerName migration-files -PackageContainerName migration-package -AccountName o365mig -AccountKey "REPLACE_WITH_YOUR_AZURE_STORAGE_ACCOUNT_ACCESS_KEY"


The Parameters used for this command are as follows:

  • SourceFilesPath – The directory location where the package’s source content files exist.
  • SourcePackagePath – The directory location where the package’s metadata files exist.
  • FileContainerName – The name of the Azure Blob Storage container. It will hold the uploaded package content files.
  • PackageContainerName – The name of the Azure Blob Storage container. It will hold the uploaded package metadata files.
  • AccountName –  The account name for the Azure Storage account.
  • AccountKey – The account key for the Azure Storage account.
  4.    Submit Migration Job:  Submit a new migration job, referencing a previously uploaded package in Azure Blob Storage, into a site collection. Perform this operation using the PowerShell command below:
Submit-SPOMigrationJob -TargetWebUrl "https://<SPOSite>" -MigrationPackageAzureLocations $azurelocations


The Parameters used for this command are as follows:

  • TargetWebUrl – The fully qualified target web URL into which the package will be imported.
  • MigrationPackageAzureLocations – A set of fully qualified URLs and SAS tokens representing the Azure Blob Storage containers that hold the package content and metadata files.
  5.    Check Migration Job Status:  Optionally, you can check the status of your running migration jobs using the following command:
Get-SPOMigrationJobStatus -TargetWebUrl https://<SPOSite>
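
Since migration jobs run asynchronously, you may want to poll until the queue drains. One possible sketch, assuming `Get-SPOMigrationJobStatus` returns one entry per outstanding job (the site URL is the same placeholder used above):

```powershell
# Poll every 60 seconds until no migration jobs remain in the queue.
do {
    $jobs = @(Get-SPOMigrationJobStatus -TargetWebUrl "https://<SPOSite>")
    Write-Host "$(Get-Date -Format u) - jobs still outstanding: $($jobs.Count)"
    if ($jobs.Count -gt 0) { Start-Sleep -Seconds 60 }
} while ($jobs.Count -gt 0)
```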



The method demonstrated above has been publicly released by Microsoft to help its users migrate to the cloud. This is how we can migrate SharePoint on-premise data to SharePoint Online without using any third-party tool or adding to the project’s expense.
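
Putting the pieces together, the whole file-share flow can be sketched as a single script. All paths, account names, and URLs below are the same placeholders used in the steps above; adjust them to your environment:

```powershell
# 1. Package the file share for migration.
New-SPOMigrationPackage -SourcePath \\fileserver\share\folder1 `
    -OutputPackagePath d:\MigrationPackages\Folder1_SrcPkg `
    -TargetWebUrl "https://<SPOSite>" -TargetDocumentLibraryPath "Shared Documents"

# 2. Re-map the package metadata against the target web.
ConvertTo-SPOMigrationTargetedPackage -SourceFilesPath \\fileserver\share\folder1 `
    -SourcePackagePath d:\MigrationPackages\Folder1_SrcPkg `
    -OutputPackagePath d:\MigrationPackages\Folder1_TgtPkg `
    -TargetWebUrl "https://<SPOSite>" -TargetDocumentLibraryPath "Shared Documents"

# 3. Upload the converted package to the Azure Blob Storage containers.
$azurelocations = Set-SPOMigrationPackageAzureSource `
    -SourceFilesPath \\fileserver\share\folder1 `
    -SourcePackagePath d:\MigrationPackages\Folder1_TgtPkg `
    -FileContainerName migration-files -PackageContainerName migration-package `
    -AccountName o365mig -AccountKey "REPLACE_WITH_YOUR_AZURE_STORAGE_ACCOUNT_ACCESS_KEY"

# 4. Submit the migration job and check on its status.
Submit-SPOMigrationJob -TargetWebUrl "https://<SPOSite>" -MigrationPackageAzureLocations $azurelocations
Get-SPOMigrationJobStatus -TargetWebUrl "https://<SPOSite>"
```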

Datavail Script: Terms & Conditions

By using this software script (“Script”), you are agreeing to the following terms and conditions, as a legally enforceable contract, with Datavail Corporation (“Datavail”). If you do not agree with these terms, do not download or otherwise use the Script. You (which includes any entity whom you represent or for whom you use the Script) and Datavail agree as follows:

  1. CONSIDERATION. As you are aware, you did not pay a fee to Datavail for the license to the Script. Consequently, your consideration for use of the Script is your agreement to these terms, including the various waivers, releases and limitations of your rights and Datavail’s liabilities, as set forth herein.
  2. LICENSE. Subject to the terms herein, the Script is provided to you as a non-exclusive, revocable license to use internally and not to transfer, sub-license, copy, or create derivative works from the Script, not to use the Script in a service bureau and not to disclose the Script to any third parties. No title or other ownership of the Script (or intellectual property rights therein) is assigned to you.
  3. USE AT YOUR OWN RISK; DISCLAIMER OF WARRANTIES. You agree that your use of the Script and any impacts on your software, databases, systems, networks or other property or services are solely and exclusively at your own risk. Datavail does not make any warranties, and hereby expressly disclaims any and all warranties, implied or express, including without limitation, the following: (1) performance of or results from the Script, (2) compatibility with any other software or hardware, (3) non-infringement or violation of third party’s intellectual property or other property rights, (4) fitness for a particular purpose, or (5) merchantability.

You hereby release Datavail from any claims, causes of action, losses, damages, costs and expenses resulting from your downloading or other use of the Script.

  4. AGREEMENT. These terms and conditions constitute the complete and exclusive legal agreement between you and Datavail.

