The design and development phase

During the design and development phase, you will need to work actively on data migration. The key steps during this phase are data mapping and transformation for the identified migration elements, creation of the tests and go-live plan for migration, and development of the scripts, templates, and test datasets for migration.

Data mapping and transformation

The following are the key steps for managing the data mapping of the source and target systems:

  • Select the source systems to use for migration. If the data is stored in multiple places in a legacy system, pick the cleanest source from which to extract a copy. Consider the update timings in the source, and add dependencies in the go-live plan so that the source data is updated before the extraction starts.
  • Define field mapping between the legacy systems and Dynamics AX, along with any transformations that need to happen between the extracts and the import processes.
  • Identify the areas that need data cleansing, and have the data cleansing efforts started early on. Define cleansing rules or have the issues addressed in the source systems (for example, bad addresses, phone numbers, and so on); a sketch of a simple rule check follows this list.
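
To illustrate such a rule check, the following X++ job is a minimal sketch; it assumes a hypothetical staging table named MigrCustStaging with Phone and CustAccount fields, and it flags phone numbers that contain too few digits so that they can be corrected in the source system before extraction:

    static void migrFlagBadPhoneNumbers(Args _args)
    {
        // Hypothetical staging table holding extracted customer data
        MigrCustStaging staging;
        str digitsOnly;

        while select staging
        {
            // Keep only the numeric characters of the phone number
            digitsOnly = strKeep(staging.Phone, '0123456789');

            // Flag numbers that are too short to be valid (an example rule;
            // real rules should come from the data cleansing workshops)
            if (strLen(digitsOnly) < 10)
            {
                info(strFmt("Suspect phone number '%1' for customer %2",
                            staging.Phone, staging.CustAccount));
            }
        }
    }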

Planning the data migration

During the design and development phase of the project, you should develop the overall plan for migrating the identified data elements. The following is a list of items to consider when developing your plan:

  • The data migration environment: Plan for an environment in which to run data migrations iteratively. You don't want the test environment disrupted every week while the data migration team is still stabilizing the data migration processes.
  • Plan for multiple cycles of data migration that are a few weeks apart. This allows time to validate the data, fix issues, and improve the performance of the migration processes.
    • Business resources will be required to help extract and validate the data for each cycle. They may also need to help cleanse the data if you run into issues with the legacy data.
    • IT resources will be required to extract, import, and validate the data. Because data conversion is an iterative process, it is a good idea to train junior resources to execute it, and to have the experienced resources focus on improving the process based on the feedback received from data validation.
    • Data quality in the source system has a huge impact on the number of data migration iterations that you have to perform during tests.
  • Complete a full data migration prior to starting system integration testing, UAT, and training. These migrations should be performed by following the data migration process documentation, and the time for each step needs to be recorded. As part of this process, have the migrated data validated by the business prior to starting the tests in these environments.
  • Come up with iterative, automated processes that include data extraction from the legacy systems. This shortens the data migration cycle time, improves quality, and provides consistent results. (For extraction, you may be able to get away with the reports that the business already uses. For example, if the business runs a detailed AR aging report, you can use that report as an input for migration rather than building a separate extraction job.)
  • The team should record the timing of each process and map out the dependencies and the processes that can run in parallel. As noted earlier, the Sure Step data migration requirements spreadsheet can aid in documenting this.
  • Document the migration process end-to-end, from data extraction and intermediate validation to migration (the development team that writes the code should not be the one executing it). With a documented process, you can engage junior team members to execute the repetitive data migration processes and free the senior resources to work on other parts of the project.
  • Visual communication for stakeholders: Use a visual presentation of the data migration process to communicate its progress.

Selecting the tools for data migration

In this section, we will talk about the most commonly used tools, including the Data Import/Export Framework (DIXF) shipped with Dynamics AX. Microsoft has made huge investments in the Data Import/Export Framework, and it has been a big step toward solving the challenges in data migration. Although the introduction of stable and rich features in DIXF has reduced the need for other tools, we will discuss several migration options.

How do I select the right tool?

There are several factors that you should consider:

  • Do you have something to reuse (integrations or imports already built for the project)?
  • What is the data volume? Hundreds, thousands, or millions of records?
  • What kind of validation or transformation is required?
  • What is the format of the source data? CSV, SQL, flat files, and the like?
  • How much downtime is available for migration? This drives the investment you need to make in building additional features rather than using the standard tools that are available.
  • What is the trade-off between validations and import time, given the quality of the source data?

Data migration versus data entry

It is often easier to simply enter the data than to migrate it, provided that the dataset to be migrated contains only a few hundred records that can be entered in a timely manner and with relatively few errors. Part of the consideration of whether to manually load the data is the required timing: if you can do it ahead of time or post-release, manually entering the data is a good method. If the data load needs to take place during the downtime window, on the critical path, it could be tricky.

Data import features developed on the project

Your project may already include custom import features, such as a general journal upload or a fixed assets import. Try to leverage similar import programs for data migration.

The Data Import/Export Framework

The Data Import/Export Framework (DIXF) is an extension module that helps export and import data in Dynamics AX. DIXF is an integral part of Dynamics AX 2012 R3, and it can be installed separately and used with Dynamics AX 2012 R2 and the RTM release as well.

The common pattern for the use of DIXF is shown in the following diagram:

[Figure: the common DIXF usage pattern]

The framework is commonly used in the following areas:

  • Master data, open orders, on-hand inventory, and balances
  • Moderate to high numbers of records (a few thousand to a couple of hundred thousand records)
  • Low customization

The framework is not recommended in the following areas:

  • Highly customized areas
  • A huge volume of data (where you need to bypass the Dynamics AX business logic)

Terminology

Let's first understand the common terms used with DIXF. They are explained as follows:

  • Source: This is the external data source from which we want to import data into Microsoft Dynamics AX 2012.
    • File: Text (delimited and fixed width), Microsoft Excel, and XML
    • ODBC: Databases, Microsoft Excel, Microsoft Access, and so on
  • Staging: This is the intermediary table inside Microsoft Dynamics AX 2012.
  • Target: This is the entity inside Microsoft Dynamics AX 2012 into which we want to import data from an external data source, for example, customer, vendor, and so on.

Architecture

The following diagram shows the architecture of the Data Import/Export Framework. It can basically be understood as a Source | Staging | Target process:

[Figure: the Data Import/Export Framework architecture (Source | Staging | Target)]

The Data Import/Export Framework creates a staging table for each entity in the Microsoft Dynamics AX database where the target table resides. Data that is being migrated is first moved to the staging table. There you can verify the data, and perform any cleanup or conversion that is required. You can then move the data to the target table or export it.

The import/export process

The following diagram shows the steps that are required to import or export data in Microsoft Dynamics AX:

[Figure: the import/export process]
  1. Determine the source of the data to export or import, and create a source data format for the data. For export, the source is AX. For import, you can use any of the following sources:
    • AX: This imports data from another Microsoft Dynamics AX instance
    • ODBC: This imports data from other databases such as Microsoft SQL Server or Microsoft Access
    • File: This imports data from a fixed-width or delimited text file, XML file, or Microsoft Excel file
  2. Determine the entity to associate with the data. This entity can be either the source of the export data or the target for the import data. You can use an existing entity or create a custom entity. There are 150 out-of-the-box entities, and the list is continuously growing.
  3. Determine the entities that should be imported or exported together, and put all these entities in a processing group. A processing group is a set of entities that must be processed in a sequence, or that can logically be grouped together. The entities in a processing group are exported together, or they are imported together from source to staging and then from staging to target. In a processing group, you also associate each entity with a source data format.
  4. Use the processing group options to either import or export data.
  5. For import, you first import the data to a staging table where you can clean or transform the data as required. You should validate that the data appears accurate and that the reference data is mapped correctly. You then migrate the data from the staging table to the target table. You should validate that the entity appears accurate in the target table.

A summary of key features

  • Compare and copy entity data across legal entities.
  • Entity types: The different types of entities are entity, composite entity, and flat table.
  • Mapper control: This allows flexible mapping, supports complex transformations, and the m:n cardinality between the target entities as shown in the following diagram:
    [Figure: the mapper control]
  • Parallel execution support from staging to target using task bundling.
  • Folder as input for running periodic import with functionality to move the files to different folders (in-process, error, and completed).
  • Error handling: This includes skipping error rows and storing errors in XML files.
  • Merging data into a single source column and splitting it during transformation, for example, financial dimensions.
  • Multiple source columns can be used to create records in multiple tables, for example, an address book.
  • Use the Data Import/Export Framework as an Application Integration Framework (AIF) service to export data to a file. This feature allows you to leverage the data migration code for ongoing integrations.
  • Set-based support from staging to target.
  • Default value support.
  • Number sequence support.
  • External key mapping support.
  • Source to target in a single step.
  • Multiple AOS support.

AIF

This section describes the usage of AIF in the context of data migration.

The Application Integration Framework is generally used in the following areas:

  • Data from highly structured systems (XML formatting)
  • Areas with existing out-of-the-box data inputs (document services)

The framework is not recommended in the following situations:

  • Large data sets
  • Complex integration

The features of AIF

This section describes the out-of-the-box AIF features of Dynamics AX:

  • AIF can import many similar records and repeat the same import at regular intervals, or it can be used for ongoing integrations
  • Value mapping and error handling features provided by AIF can be leveraged
  • Document services must be customized if the underlying tables and entities have been customized
  • AIF uses the Dynamics AX business logic and validation

Custom X++

Custom X++ is commonly used for customized areas or custom actions, for example, applying cash discount schedules to specific accounts after migrating vendor invoices, or applying custom payment schedules to break up the due dates of open AR records. It is also used when large datasets must be migrated from SQL, for example, a need to release 10 million product variants as part of migration.

It is not needed for entities that are supported by DIXF where DIXF can handle the data volume.

Tip

DIXF can help you avoid building custom code for migration.

Describing custom X++

As the name suggests, you need to write custom code to import data from a file and apply business rules during the import. In some cases, writing custom code or jobs to import SQL tables and CSV/flat files for custom processes may be an easier and faster way to approach data migration.

As this is custom code, you have more control over (and responsibility for) the validations you add or bypass, depending on how well the source data has been pre-validated; the standard Dynamics AX business logic is not applied to your data.

In projects where millions of records need to be migrated, you may not be able to afford going through the Dynamics AX business logic; hence, you need to take a custom route. You can use the power of SQL to speed up performance, load data directly into Dynamics AX, and use custom code in Dynamics AX to further process the data. For example, you can import data into a staging table using direct SQL statements and work around RecId assignment by reserving RecIds for the custom staging table through X++, and then write an X++ class, which can be multithreaded, to further process the data.
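
The following is a minimal sketch of the RecId reservation part of this technique; it assumes a hypothetical custom staging table named MigrProductStaging and uses the SystemSequence class to reserve a contiguous block of RecIds that the direct SQL bulk insert can then assign safely:

    static void migrReserveRecIds(Args _args)
    {
        SystemSequence sequence = new SystemSequence();
        int64          rowsToInsert = 10000000;
        int64          firstRecId;

        // Suspend normal RecId allocation for the staging table
        sequence.suspendRecIds(tableNum(MigrProductStaging));

        // Reserve a contiguous block; the first reserved RecId is returned
        firstRecId = sequence.reserveValues(rowsToInsert,
                                            tableNum(MigrProductStaging));

        // The external bulk insert can now assign RecIds in the range
        // firstRecId .. firstRecId + rowsToInsert - 1 without collisions
        info(strFmt("Reserved RecIds %1 to %2", firstRecId,
                    firstRecId + rowsToInsert - 1));

        sequence.removeRecIdSuspension();
    }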

To use such an approach, you need a deep understanding of the Dynamics AX processes, its data structure, and how data flows across tables.

Excel add-in

It is commonly used for the following purposes:

  • Spreadsheet-based data, mostly setup tables (for example, payment terms and dimensions)
  • Entities that have AIF document services available
  • A small or medium number of records where performance is not a concern (a few hundred records; for larger volumes, the process becomes very slow)

An Excel add-in is not useful for larger datasets.

Describing an Excel add-in

Data is validated using the business logic in Dynamics AX. End users can use this method, and it is an opportunity to engage them in the system setup. The following screenshot shows the payment terms set up by uploading data (publishing) through the Excel add-in:

[Screenshot: payment terms set up through the Excel add-in]

Data migration execution tips

This section includes tips and tricks based on real-world data migration experience.

Initial templates for business

Define template spreadsheets for the common data migration scenarios and collect sample data from the business early on. Use this data for the conference room pilot (CRP). This will help to identify problems sooner and eliminate surprises later.

Extracting source data into SQL tables

If possible, extract the source data into SQL tables. It is easier to clean and transform data using SQL queries than by manipulating data in flat files. Data validation can then be done directly on the source staging table, avoiding the repeated failures and wasted time that come from loading bad data.
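
As an illustration, the following X++ snippet is a minimal sketch of such a validation; the server name MigrationSqlServer, database MigrationStaging, and table CustSource are all hypothetical, and the query simply counts source rows that would fail the import because a customer group is missing:

    static void migrValidateSourceStaging(Args _args)
    {
        LoginProperty  loginProperty = new LoginProperty();
        OdbcConnection odbcConnection;
        Statement      statement;
        ResultSet      resultSet;
        str sql = 'SELECT COUNT(*) FROM CustSource WHERE CustGroup IS NULL';

        // Hypothetical server and staging database names
        loginProperty.setServer('MigrationSqlServer');
        loginProperty.setDatabase('MigrationStaging');

        odbcConnection = new OdbcConnection(loginProperty);
        statement      = odbcConnection.createStatement();

        new SqlStatementExecutePermission(sql).assert();
        resultSet = statement.executeQuery(sql);
        CodeAccessPermission::revertAssert();

        if (resultSet.next())
        {
            info(strFmt('%1 source rows are missing a customer group',
                        resultSet.getInt(1)));
        }
    }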

Never rename/repurpose fields

Even if you will never need a particular field, future versions of AX may remove it or add functionality to it. Creating new fields in AX is easy, so create a new field instead of repurposing an existing one.

Considering premigration steps

Come up with an approach to convert the bulky data elements ahead of time. For example, in a project where we had to migrate 10 million products, we migrated most of them a week early and maintained the products in both systems concurrently for that one-week period. This reduced our downtime over the release weekend.

Considering postmigration steps

You may have data elements that are not required at go-live (for example, fixed assets migration); try to push them to the following weekend.

Changing SQL to simple recovery mode

Set the SQL Server database to the simple recovery model for faster migration performance. Make sure to set it back when the migration is completed.

Multithreading and max DOP

Multithreading can help you speed up the process by executing it in multiple threads. However, it can degrade performance if you add more threads than your SQL Server or AOS can handle. Often, the Max DOP (Maximum Degree of Parallelism) setting on SQL Server needs to be adjusted to control how many CPUs each query can use, so that an optimal number of threads can execute in parallel.
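
A minimal sketch of fanning work out into parallel batch tasks follows; it assumes a hypothetical MigrBundleWorker class extending RunBaseBatch whose construct method takes a bundle number, with each task processing one bundle of staging records:

    static void migrRunBundlesInParallel(Args _args)
    {
        BatchHeader      batchHeader = BatchHeader::construct();
        MigrBundleWorker worker;          // hypothetical RunBaseBatch subclass
        int              bundle;
        int              bundleCount = 8; // tune to what SQL and the AOS can handle

        batchHeader.parmCaption('Data migration: staging to target');

        // One batch task per bundle; tasks without dependencies run in parallel
        for (bundle = 1; bundle <= bundleCount; bundle++)
        {
            worker = MigrBundleWorker::construct(bundle);
            batchHeader.addTask(worker);
        }

        batchHeader.save();
    }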

Index and statistics maintenance

Consider index and statistics maintenance after a bulk load of master data to speed up the dependent processes. If you have indexes that slow down your inserts, disable them during the bulk load and rebuild them afterward.

Disabling the AX logging

Turn off all database logging in AX during migration; logging will significantly slow down the process.

Considering SQL updates on migrated data

Sometimes, it is easier to perform a basic migration of standard data into AX and then use SQL to update the values. Ensure that your scripts and steps are well documented.
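
The following is a minimal sketch of running such an update from X++ against the AX database; the field values, company ID, and the update itself are illustrative only:

    static void migrUpdateMigratedData(Args _args)
    {
        // Illustrative statement: stamp a default group on migrated customers
        str sql = "UPDATE CUSTTABLE SET CUSTGROUP = 'MIGR' " +
                  "WHERE CUSTGROUP = '' AND DATAAREAID = 'usmf'";
        Connection connection = new Connection();
        Statement  statement  = connection.createStatement();

        new SqlStatementExecutePermission(sql).assert();

        // executeUpdate returns the number of rows affected; record it
        info(strFmt("%1 rows updated", statement.executeUpdate(sql)));

        CodeAccessPermission::revertAssert();
    }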

The SQL import – with caution and expertise

Sometimes, it is faster to import high-volume data directly into SQL. You save time by bypassing the Dynamics AX business logic, which means you must run your own data quality validations. Generation of RecIds in Dynamics AX can be tricky. However, it can be handled by resources who are well versed in the Dynamics AX architecture (a number sequence is maintained in the system sequences table for each table ID) and who can draw the boundaries between importing through SQL and running through the X++ business logic.

Managing configurations

This is another important area that can easily go out of control if not managed well. You need a good mechanism for tracking configuration changes and for identifying the resources responsible for each area who make changes in your golden (staging) environment.

Once you have the golden environment set up, take a backup of the database with all the configuration (premigration) data; this backup can simply be restored every time you start a new migration run. If you have the liberty to set up the production environment ahead of time, all the configurations can be made in production; bring the production data into the migration environment and run migration tests on it. If you cannot set up and use the production environment for configurations, create the golden environment and use it instead.

Configuration management simplified with DIXF

You can create a template legal entity and use it for setting up new legal entities. You can also copy configurations from one environment to another:

[Figure: configuration management simplified with DIXF]

Tip

The copy company functionality from AX 2009 and prior versions is no longer available in AX 2012.

Reviewing and deciding on the configuration

There are certain configurations on the project that you need to review carefully, making the decisions with the cross-functional requirements in mind. These are configurations where there is no going back, such as the product dimension group, storage dimension group, tracking dimensions, inventory model group, and so on. Each checkbox has a great impact on the outcome, and once transactions have taken place, you can't change these selections easily. These configurations impact both the financial and supply chain aspects, which is why you need cross-functional reviews.
