Salesforce Commerce Cloud B2C Data Migration — Introduction to Data Scoping (Part 1)
An article series about data migration from legacy systems to Salesforce B2C Commerce Cloud
Why am I writing about this?
In my previous blog posts, I wrote only one article about data migration, and it focused specifically on password migration. There I highlighted several possible data objects from the standpoint of business capabilities. Even so, it did not cover all the best practices I know, nor how we could actually execute such a migration process. It does, however, offer practical, up-to-date strategies for migrating customers’ passwords, techniques commonly used today by SMEs (Subject Matter Experts) across the globe.
Conversations in the Salesforce B2C Commerce Community over the past year and into 2021 have motivated me to start a blog series covering everything I know about data migration, and I consider it a privilege to share that knowledge with you. By the way, if you want to know what the Salesforce B2C Commerce Community is and how you can join that amazing place full of outstanding people and experts, I invite you to read the article below.
What will I write about?
While drafting the content I wanted to cover, I came up with a list of data migration topics that I believe each deserve a separate article:
- Data migration scoping.
- Data migration modeling.
- Data migration strategy.
- Data migration rehearsal strategy.
- Data quality for data migration.
- Data migration timeline and RACI matrix.
In my experience, the rehearsal strategy is one of the least documented and least discussed of these topics, and I plan to cover it in a future article. Today, though, let’s talk about data migration scoping: what it is and how we can integrate it into our e-commerce business processes.
Data migration scoping
Normally, when speaking to anyone about data migration, the first few questions that come to mind are:
- Why do we want to migrate data?
- What data do we want to migrate?
- What data do we not talk about, but which is nevertheless part of what we need to migrate?
- What is the volume of the data in question?
- Do all stakeholders understand what we need to scope for data migration?
- Does everyone understand what scoping is?
Asking these questions helps me and others align on what we are doing and define the scope of data migration. At this point, we do not try to map legacy data to the data that will be stored or managed on B2C Commerce Cloud, and we do not discuss how exactly we are going to migrate it. Everything we discuss during data migration scoping is only about understanding the “what” and the “why.” That “what” and “why” will drive all future steps of the data migration and the decisions we will have to make. As a best practice, these discussions should happen at a high level during pre-sales, and certainly no later than the gap analysis or the first few discovery workshops.
How can we accelerate identifying “What”?
The first three questions listed above are the ones we must ask in e-commerce and, in particular, in Salesforce B2C scenarios, where data migration almost always comes down to a fixed set of data points that any business should consider when planning its e-commerce processes. For example, if you open any existing website’s homepage, the first thing you would think to migrate is the content behind its UI/UX components; in the scope of data migration, that means the images and textual artifacts behind those components. Browsing the e-commerce site further, we would identify the categories of the product catalog that require migration, the products and their assignments to those categories, and likewise the product images and descriptions. Alongside product data you would often see ratings and reviews, data you would also want on the migrated website, as well as prices and inventory for those products. After manual navigation, I would try to search for something and would most likely see search suggestions and autocomplete, logic that relies on common dictionaries or custom rules. And if I were trying to find stores of the brand I am browsing, I would encounter the store locator page, which is driven by store data and its respective geolocation.
Now let’s stop for a second. What do we have so far?
If we speak in Salesforce B2C jargon, our list would be:
- content slots and content assets,
- static content images,
- product navigation catalog with navigation rules such as sorting rules and refinements,
- product master catalog,
- ratings and reviews,
- product pricebook,
- product inventory,
- physical stores data with longitude and latitude coordinates,
- search rules with dictionaries.
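To make the outcome of such a walkthrough easier to share with stakeholders, I find it helps to capture it as structured data rather than a free-form list. Below is a minimal Python sketch of how that inventory could look; the touchpoint names and groupings are my own illustration, not an official B2C Commerce data model.

```python
# Hypothetical scoping inventory: each storefront touchpoint from the
# walkthrough mapped to the B2C Commerce data objects it implies.
# The keys and groupings are illustrative, not an official data model.
SCOPING_INVENTORY = {
    "homepage": ["content slots", "content assets", "static content images"],
    "category navigation": ["navigation catalog", "sorting rules", "refinements"],
    "product detail page": ["master catalog", "product images", "ratings and reviews"],
    "pricing and availability": ["price books", "inventory lists"],
    "search": ["search dictionaries", "search rules"],
    "store locator": ["store records with longitude/latitude"],
}


def flatten_scope(inventory: dict[str, list[str]]) -> list[str]:
    """Turn the walkthrough findings into a single flat checklist."""
    return [
        f"{touchpoint}: {obj}"
        for touchpoint, objects in inventory.items()
        for obj in objects
    ]


if __name__ == "__main__":
    for item in flatten_scope(SCOPING_INVENTORY):
        print(item)
```

The point of the structure is not the code itself but the habit: every touchpoint you click through should translate into named data objects that can later be sized, owned, and mapped.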
And that would be only the beginning. After those validations, we would certainly look at promotions and long-running campaigns (e.g. customer loyalty programs). We would also try adding a product to the cart and navigating to the checkout page, where we would identify at least the shipping and payment methods that, as a business stakeholder, I would expect to see on the migrated platform. Let’s assume we did all that and now want to check the user experience for a “known customer” persona, that is, a person who has an account on the website. Such a persona (an account data type) carries various data entities that a business records, stores, and uses for advanced capabilities, so we would expect those entities to be migrated as well.
So, in effect, we end up with a set of customer objects (groups of data) to migrate, some of which are worked on by other experts. These include:
- customer login credentials including SSO such as Facebook or LINE login data,
- customer profile information with shipping and billing addresses,
- customer credit cards or other payment instruments,
- customer product lists,
- customer auto-replenishment data,
- customer orders from online and offline experiences,
- customer loyalty data and external IDs to CRM or other platforms.
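For illustration, the customer-side scope can be sketched as a single record shape that gathers every entity the business would expect to follow the customer onto the new platform. The field names below are assumptions made for this sketch, not the B2C Commerce customer data model.

```python
from dataclasses import dataclass, field
from typing import Optional


@dataclass
class CustomerMigrationRecord:
    """Illustrative grouping of the customer data entities listed above."""
    customer_no: str
    credentials: dict                      # login, password hash, SSO IDs (e.g. Facebook, LINE)
    profile: dict                          # name, email, phone, preferences
    addresses: list = field(default_factory=list)            # shipping and billing addresses
    payment_instruments: list = field(default_factory=list)  # tokenized cards or other instruments
    product_lists: list = field(default_factory=list)        # wishlists, gift registries
    auto_replenishment: Optional[dict] = None                 # subscription/replenishment settings
    order_refs: list = field(default_factory=list)            # online and offline order identifiers
    loyalty: Optional[dict] = None                            # loyalty data plus external CRM IDs
```

Even at scoping time, sketching the shape like this makes it obvious which entities have an owner, which depend on external systems (SSO providers, CRM), and which will need their own migration discussion later.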
All that navigation happens in a browser against specific URLs, and during the migration to the new platform those URLs will most likely change. So how do we keep the customer experience smooth when customers arrive via saved bookmarks or well-ranked URLs in Google’s index? To achieve this, we must migrate the URLs following the 301 redirect approach; the details will be discussed in a future article. While we are talking about URLs, there are other things to migrate, such as URL rules and the SEO artifacts you would expect: the tagging plan, canonical URLs, SEO configurations, meta links, and so forth. All those SEO artifacts are part of the migration data as well.
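As a simple illustration of the 301 redirect idea, the sketch below builds a legacy-to-new URL mapping from a CSV file. The file names and column names are assumptions for this example; how the mapping is actually served (platform redirect objects, web server rules, or a CDN) is a separate design decision covered later in the series.

```python
import csv


def build_redirect_map(legacy_csv: str, output_csv: str) -> int:
    """Read legacy_url/new_url pairs and write a 301 redirect mapping.

    Returns the number of redirects written.
    """
    count = 0
    with open(legacy_csv, newline="", encoding="utf-8") as src, \
         open(output_csv, "w", newline="", encoding="utf-8") as dst:
        reader = csv.DictReader(src)              # expects columns: legacy_url, new_url
        writer = csv.writer(dst)
        writer.writerow(["source", "destination", "status_code"])
        for row in reader:
            writer.writerow([row["legacy_url"], row["new_url"], 301])
            count += 1
    return count


# Example usage (file names are hypothetical):
# build_redirect_map("legacy_urls.csv", "redirect_map.csv")
```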
It’s already a lot… do I really need all of this???
To approach data migration discussions, stakeholders need good knowledge of the platform we are migrating to and of the common data sets that must be configured for the site to run efficiently. It also means SMEs will need to apply strong analytical skills during this scoping phase: discover the “as is” data landscape (in the legacy system), understand the “to be” data landscape, and identify the key business capabilities and the data that matters most. We also need to know the volume of the data in question, so metrics from the existing system are a must-have for these activities. With all this information gathered, you meet with IT professionals and business stakeholders to go through the six (6) questions listed at the beginning of this article, and follow the steps shown in the slides, to describe the “scope” or get the “answers” as we (the delivery team) understand it.
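As a rough idea of what such metrics could look like when brought into the scoping workshop, here is a small Python sketch. The object names and numbers are placeholders; in practice they would come from legacy database counts or export file sizes.

```python
# Placeholder volume snapshot pulled from the legacy system before scoping.
VOLUME_METRICS = {
    "products": 120_000,
    "categories": 850,
    "content assets": 3_200,
    "customer profiles": 1_500_000,
    "orders (last 24 months)": 4_300_000,
    "stores": 240,
}


def volume_report(metrics: dict[str, int]) -> None:
    """Print a simple, sorted volume report for the scoping workshop."""
    width = max(len(name) for name in metrics)
    for name, count in sorted(metrics.items(), key=lambda kv: kv[1], reverse=True):
        print(f"{name:<{width}}  {count:>12,}")


if __name__ == "__main__":
    volume_report(VOLUME_METRICS)
```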
The full list of the data points covered in this article is:
- content slots and content assets (including translation for different locales),
- static content images,
- product navigation catalog with navigation rules such as sorting rules and refinements (including translation for different locales),
- product master catalog (including translation for different locales),
- ratings and reviews,
- product pricebook,
- product inventory,
- the physical store(s) data with longitude and latitude coordinates,
- search rules with dictionaries,
- customer login credentials including SSO like Facebook or LINE login data,
- customer profile information with shipping and billing addresses,
- customer credit cards or other payment instruments,
- customer product lists,
- customer auto-replenishment data,
- customer orders from online and offline experiences,
- customer loyalty data and external IDs to CRM or other platforms,
- SEO-friendly URLs, legacy URLs, page meta tags, canonical URLs.
…did I miss anything? :)
I would honestly expect you to ask yourself right now:
How would I execute data migration scoping using knowledge from this article?
My name is Oleg Sapishchuk, and I’m an experienced Solution Architect delivering digital transformation through unified commerce solutions for some of the world’s best-known and most influential brands. Salesforce B2C Commerce is the topic I write about most frequently. If you are interested in new or previously written material, I invite you to follow my Medium profile.