Thursday, December 15, 2022

Be careful when using SQL reserved words in your D365FO data entity field names

I thought I would share my latest troubleshooting session, one that almost caused me brain damage.

A customer complained about a new custom data entity that worked fine when exported to a plain CSV file, but failed to complete when directed to a BYOD database.

The error message was: “An Error occurred in add data flow component for Entity Store”, which of course does not give us much to go on.

The first thing I did was to remove all but one field mapping in the Data management export project, which then completed without errors. I kept adding fields back until the export failed, and by doing so I discovered which field was causing the issue. The field name was “IsAutomaticalSelect”, which is a valid D365FO field name, but whenever that field was included, the export failed.

I then created a local BYOD database on my developer box, ran the export there, and sure enough got the same issue. But now I had full access to the event logs without having to use LCS monitoring, and I started traversing the logs in these three DIXF event folders:

Looking through the error events in the logs can be daunting, but I found one called “AddEntityDBDataFlowComponent”, which turned out to be the one to inspect in this case.

The following is a similar example using CustCustomerGroupEntity where I added the field NameSelect as a duplicate of the field Description.


At the bottom of the event we find the SELECT statement of the package, and by closely examining the code, I found that the statement was not valid.

The first problem in the extracted statement below is that the field NAMESELECT is not followed by a comma, which means that the N'Test-BYOD' after it is considered part of the field definition.

The second problem is the opposite: there is one comma too many between SYNCSTARTDATETIME and ISNULL.

SELECT N'Test-BYOD' AS DEFINITIONGROUP
            ,N'Test-BYOD-2022-12-15T17:36:25-EAF0BF4CAF964D8BB3EA6AD99BF1C159' AS EXECUTIONID
            ,0 AS ISSELECTED
            ,0 AS TRANSFERSTATUS
            ,0 AS ISPUBLICSECTOR_IT
            ,N'initial' AS PARTITION
            ,N'XXXX' AS DATAAREAID
            ,GETUTCDATE() AS SYNCSTARTDATETIME
            ,ISNULL(T1.CLEARINGPERIODPAYMENTTERMNAME, '') AS CLEARINGPERIODPAYMENTTERMNAME
            ,ISNULL(T1.CUSTOMERACCOUNTNUMBERSEQUENCE, '') AS CUSTOMERACCOUNTNUMBERSEQUENCE
            ,ISNULL(T1.CUSTOMERGROUPID, '') AS CUSTOMERGROUPID
            ,ISNULL(T1.DEFAULTDIMENSIONDISPLAYVALUE, '') AS DEFAULTDIMENSIONDISPLAYVALUE
            ,ISNULL(T1.DESCRIPTION, '') AS DESCRIPTION
            ,ISNULL(T1.ISSALESTAXINCLUDEDINPRICE, 0) AS ISSALESTAXINCLUDEDINPRICE
            ,ISNULL(T1.NAMESELECT, '') AS NAMESELECT N'Test-BYOD' AS DEFINITIONGROUP
            ,N'Test-BYOD-2022-12-15T17:36:25-EAF0BF4CAF964D8BB3EA6AD99BF1C159' AS EXECUTIONID
            ,0 AS ISSELECTED
            ,0 AS TRANSFERSTATUS
            ,0 AS ISPUBLICSECTOR_IT
            ,N'initial' AS PARTITION
            ,N'XXXX' AS DATAAREAID
            ,GETUTCDATE() AS SYNCSTARTDATETIME
            , ,ISNULL(T1.PAYMENTTERMID, '') AS PAYMENTTERMID
            ,ISNULL(T1.TAXGROUPID, '') AS TAXGROUPID
            ,ISNULL(T1.WRITEOFFREASON, '') AS WRITEOFFREASON
FROM CUSTCUSTOMERGROUPENTITY T1

After banging my head against the wall for a while, I realized that SELECT is a SQL reserved word, and it hit me that there might be a connection. Sure enough, as soon as I renamed the field to "NAMESELECTION", it worked fine.
 
I did try naming the field “NAMEWHILE”, and that did not fail, so it might just be the SELECT suffix that produces this error.
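Just to confirm that plain T-SQL has no problem with an identifier that merely ends in a reserved word, here is a contrived sketch (table and data made up for illustration) that runs fine, which points the finger at the DIXF statement generator rather than SQL Server:

-- An identifier ending in SELECT is perfectly legal T-SQL:
CREATE TABLE #Demo (NAMESELECT NVARCHAR(60))
INSERT INTO #Demo (NAMESELECT) VALUES (N'Test')
SELECT ISNULL(T1.NAMESELECT, '') AS NAMESELECT FROM #Demo T1
DROP TABLE #Demo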


Wednesday, March 10, 2021

Installing ISV licenses

 

When it comes to license codes received from Independent Software Vendors (ISVs), the right way to install them is through the Lifecycle Services (LCS) Asset Library, adding them to your build pipeline as described here: https://docs.microsoft.com/en-us/dynamics365/fin-ops-core/dev-itpro/dev-tools/pipeline-add-license-package

But usually you do not want to install your builds on a developer box (Tier 1), because that will deploy a lot of binary files that you do not need. So, what can we do instead? As I see it, there are three choices, and I will explain each of them in a little more depth:

  • Install using dedicated license deployable package
  • Install using command line import
  • Install by running SQL script directly on the database

Install using dedicated license deployable package

This method still requires you to install a deployable package, as if you were installing a solution, but the package will contain nothing except one or more ISV licenses.

To manually create this package, go to your developer environment and the directory K:\AosService\PackagesLocalDirectory\bin\CustomDeployablePackage (the drive letter might differ on your machine).



Here you will find an ImportISVLicense.zip file. Extract it to your favorite working directory, like C:\temp, and copy the ISV license files you received from your supplier into the license directory: C:\Temp\ImportISVLicense\AosService\Scripts\License



Re-zip the file, upload it to your LCS Asset Library, and wait for it to validate. Make sure that the Package type says “ISV License Package” and that it is valid; otherwise something went wrong.



From here on, you can install the package by applying it to your environment through environment maintenance in LCS.

Of course, you can also skip the Asset Library if you prefer to manually apply the deployable package from the command line in your environment: https://docs.microsoft.com/en-us/dynamics365/fin-ops-core/dev-itpro/deployment/install-deployable-package

Install using command line import

Your license text file can also be applied to your environment if you have elevated privileges, which we did not have on Microsoft-managed Tier 1 environments; but since those are history, it should be possible on your own hosted machines.

You will need your license file in an accessible location, like C:\temp, and the credentials of a SQL user with administrative access to your local database. Usually that would be the axdbadmin account from the environment details in LCS.



Open a command prompt as administrator and run the following script (check that the drive letter, directories, and database match your environment):

K:\AOSService\PackagesLocalDirectory\Bin\Microsoft.Dynamics.AX.Deployment.Setup.exe --setupmode importlicensefile --metadatadir K:\AOSService\PackagesLocalDirectory --bindir K:\AOSService\PackagesLocalDirectory --sqlserver . --sqldatabase AxDB --sqluser axdbadmin --sqlpwd [axdbadmin password] --licensefilename c:\temp\[LicenseFileName].txt

Wait for the execution to finish, which should only take a few minutes, and then restart your “World Wide Web Publishing Service”.

 


Install by running SQL scripts directly on the database

The last method is a little more hard-core, and I have only proven it for updating an already installed license code. Like the previous method, it requires administrative access to your local database.

If you open your ISV license text file, you will find one or more rows of XML with license information like this:



So, the license is nothing more than text that we can copy into a SQL script; we just need to find out where to put it.

The license information in D365FO is stored in the SysConfig table, so running SELECT * FROM SYSCONFIG will show the currently installed licenses.



We can see that there are some similarities between the table fields and the XML tags in the license file. What we need is the ID value, which relates to the ISV solution license code element in the AOT.

Look up your ISV solution license code and check its properties to find the PublicKey ID. That is what we will use to find the right license code row in the SysConfig table.
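To double-check the row before touching it, a small lookup along these lines should show the license in question (the column names are taken from the update statement below, and 100001 is just the example PublicKey ID):

-- Inspect the existing license row before updating it:
SELECT ID, VALUE, SHADOWVALUE, EXPIRATION, TIMESTAMP
FROM SYSCONFIG
WHERE ID = 100001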

 


 

So, running the script below would update the existing license with a new signature and expiration date:

update SYSCONFIG
set VALUE = '[Signature string from file]',
    SHADOWVALUE = '[Signature string from file]',
    EXPIRATION = '[Expire date from file]',
    TIMESTAMP = '[Timestamp date from file]'
where ID = 100001

You should synchronize your database from Visual Studio afterwards; with that done, your ISV license should be updated as well.

Disclaimer:

As I wrote at the beginning of this section, I have only tried this for updating an existing code, not for installing a brand-new one. There might be something that needs attention regarding the certificate serial number, which is taken care of automatically when installing through a deployable package.

Monday, February 22, 2021

Dealing with TLS Cipher suites in HTTPS calls on cloud hosted environments (CHE)

Another fine day at the office gave me a challenge I had met before but never really investigated. It has to do with HTTPS calls that fail due to a mismatch between the TLS cipher suites supported by the server hosting a web service and those supported by the client consuming it.

I always thought that if I asked the client to make a TLS 1.2 call and the server supported TLS 1.2, then everything should be in order. But that is not necessarily the case, because of the cipher suites that each end supports.

An explanation of cipher suites can be found at https://blog.keyfactor.com/cipher-suites-explained if you are into that kind of stuff.

My assignment was to call a semi-public web service (it just needs an API key) from D365FO, but as usual I did a few tests with Postman and C# from my own PC before hitting the D365FO CHE.

In Postman I immediately got a valid response, so I went to my trusted VS 2019 IDE, wrote a small test program, and again got the expected answer:

 

However, when I moved the test program to my CHE and ran it, I got the error message:

“The request was aborted: Could not create SSL/TLS secure channel”


I used the Qualys SSL Labs tool (https://www.ssllabs.com/ssltest/analyze.html) to check that the service actually supported TLS 1.2, and it did.


I fired up Fiddler Everywhere (https://www.telerik.com/download/fiddler/fiddler-everywhere-windows) and found that even though the request was made with TLS 1.2, there were no overlapping supported cipher suites.

My CHE client did not offer any TLS_ECDHE_ECDSA_* cipher suites, which were the only ones the service supported.


So, I checked that Windows Server 2016 supported the needed cipher suites through this link: https://docs.microsoft.com/en-us/windows/win32/secauthn/cipher-suites-in-schannel

And that D365FO also supports the right cipher suites, through this link: https://docs.microsoft.com/en-us/dynamics365/fin-ops-core/dev-itpro/sysadmin/encryption

 

Everything seemed to be in order and supported, but the call still used only cipher suites the service did not support.

Both Windows Server 2016 and D365FO supported these four cipher suites, but the client did not offer them:

  • TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256 (0xc023)
  • TLS_ECDHE_ECDSA_WITH_AES_128_GCM_SHA256 (0xc02b)
  • TLS_ECDHE_ECDSA_WITH_AES_256_CBC_SHA384 (0xc024)
  • TLS_ECDHE_ECDSA_WITH_AES_256_GCM_SHA384 (0xc02c)

At last, I got a hint from Microsoft support that I should check the registry key HKLM:\SOFTWARE\Policies\Microsoft\Cryptography\Configuration\SSL\00010002\, and there I found that only a subset of the cipher suites supported by the server was listed.

I added the four cipher suites above like this:

Windows Registry Editor Version 5.00

[HKEY_LOCAL_MACHINE\SOFTWARE\Policies\Microsoft\Cryptography\Configuration\SSL\00010002]
"Functions"="TLS_ECDHE_ECDSA_WITH_AES_256_GCM_SHA384,TLS_ECDHE_ECDSA_WITH_AES_128_GCM_SHA256,TLS_ECDHE_RSA_WITH_AES_256_GCM_SHA384,TLS_ECDHE_RSA_WITH_AES_128_GCM_SHA256,TLS_ECDHE_ECDSA_WITH_AES_256_CBC_SHA384,TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256,TLS_ECDHE_RSA_WITH_AES_256_CBC_SHA384,TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256,TLS_ECDHE_RSA_WITH_AES_256_CBC_SHA,TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA,TLS_RSA_WITH_AES_256_GCM_SHA384,TLS_RSA_WITH_AES_128_GCM_SHA256,TLS_RSA_WITH_AES_256_CBC_SHA256,TLS_RSA_WITH_AES_128_CBC_SHA256,TLS_RSA_WITH_AES_256_CBC_SHA,TLS_RSA_WITH_AES_128_CBC_SHA"

That made a difference, and everything worked.

 

Tuesday, November 17, 2020

Change Tracking in Dynamics 365 FO

 

This blog post is intended to give you some idea on how Change Tracking is used when exporting data through entities in Data Management.

It is not going to explain the basic principles of SQL Server Change Tracking (see About Change Tracking) or the simple task of configuring Data Management export projects, both of which I assume you either already know or can read about elsewhere (see Enable Change Tracking for entities).

I will try to explain what happens behind the scenes when the configuration is enabled or changed, and when data is exported.

Tables configured with Change Tracking

Let us start by investigating which tables are enabled for Change Tracking in the SQL database:

SELECT
   sct1.name AS CT_schema,
   sot1.name AS CT_table,
   ps1.row_count AS CT_rows,
   ps1.reserved_page_count*8./1024. AS CT_reserved_MB,
   sct2.name AS tracked_schema,
   sot2.name AS tracked_name,
   ps2.row_count AS tracked_rows,
   ps2.reserved_page_count*8./1024. AS tracked_base_table_MB,
   change_tracking_min_valid_version(sot2.object_id) AS min_valid_version
FROM sys.internal_tables it
JOIN sys.objects sot1 ON it.object_id=sot1.object_id
JOIN sys.schemas AS sct1 ON sot1.schema_id=sct1.schema_id
JOIN sys.dm_db_partition_stats ps1 ON it.object_id = ps1.object_id AND ps1.index_id IN (0,1)
LEFT JOIN sys.objects sot2 ON it.parent_object_id=sot2.object_id
LEFT JOIN sys.schemas AS sct2 ON sot2.schema_id=sct2.schema_id
LEFT JOIN sys.dm_db_partition_stats ps2 ON sot2.object_id = ps2.object_id AND ps2.index_id IN (0,1)
WHERE it.internal_type IN (209, 210)
ORDER BY Tracked_Name

This script lists all tables in the current database which have been activated for change tracking. We see that only the CustGroup table is currently enabled.






Details about the tracked changes are kept in two places:
  • sys.syscommittab holds high-level information about all changes, plus a pointer to the details in the side tables
  • Side tables with the naming convention “change_tracking_[unique ID]” hold those details
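If you just need a quick overview, the built-in catalog view sys.change_tracking_tables gives a simpler, though less detailed, listing than the script above:

-- One row per change-tracked table in the current database:
SELECT OBJECT_NAME(object_id) AS tracked_table,
   min_valid_version,
   begin_version
FROM sys.change_tracking_tables
ORDER BY tracked_table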

Change Tracking level on data entities

Enabling a data entity for Change Tracking enables tracking on the individual tables in the entity, based on the chosen Change Tracking level.

Consider the data entity “Fleet Management Rentals” (FMRentalEntity), which consists of three tables: FMRental, FMCustomer and FMVehicle


Primary table

Enabling Change Tracking for Primary table on the entity only enables the top-level table (FMRental) of the entity data source.




Entire entity

Enabling Change Tracking for Entire entity enables all tables of the entity data source.





Custom

Enabling Change Tracking for Custom query enables the tables of the entity data source that are added by code to the Change Tracking query.

The sample below adds the two tables FMRental and FMCustomer to the query, making them the ones that are change tracked. This means that only changes to these two tables will trigger an export of the FMRental entity records.












Disabling change tracking

Disabling Change Tracking on the entity “Fleet Management Rentals” (FMRentalEntity) will remove the tables in the entity data source from tracking, up to a point. I have experienced that sometimes not all tracked tables are removed. I am not sure which tables are left in tracking, but it seems to be tables that are possibly tracked by other entities, like DirPartyTable, InventDim, etc.

Change Tracking on incremental export

Adding a data entity to a Data Management export project for the first time with Default refresh type = Incremental and running the initial export creates a record in the table DMFEntityDbSyncVersion. This record holds the latest synchronization version for the data entity in the definition group, per legal entity. So the key is: Definition group, Data area id, Data entity name.

After each export, the synchronization version number is set to the current database Change Tracking version.
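You can inspect these stored versions directly; DEFINITIONGROUPNAME and AXSYNCVERSION appear in the script further down in this post, while the remaining columns are easiest to explore with a star select:

-- Stored sync versions for one export project (definition group):
SELECT * FROM DMFEntityDbSyncVersion
WHERE DEFINITIONGROUPNAME = 'Fleet_Rental_Export'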







CHANGE_TRACKING_CURRENT_VERSION() returns the highest change tracking version number for the database, so setting the definition group synchronization version to this number makes sure that no older records are re-exported.
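For reference, you can read that number yourself at any time:

-- Version of the most recently committed tracked change in the database:
SELECT CHANGE_TRACKING_CURRENT_VERSION() AS CURRENT_DB_VERSION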





So, the next time the export is run, the current synchronization version for the data entity is validated against the change tracking side table versions for each table in the entity data source that is marked for Change Tracking. If records exist with higher version numbers, they are considered changed and will be exported.
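As a sketch of that kind of validation (this is standard SQL Server Change Tracking practice; I have not verified that the DMF framework performs exactly this check), comparing the stored version against the minimum valid version for a table tells you whether an incremental export is still safe, or whether the tracked history has been cleaned up:

-- If the stored sync version predates the minimum valid version,
-- the change history is gone and a full export is needed:
DECLARE @last_sync_version bigint =
   (SELECT AXSYNCVERSION FROM DMFEntityDBSyncVersion
    WHERE DEFINITIONGROUPNAME = 'Fleet_Rental_Export')
SELECT CASE
      WHEN @last_sync_version < CHANGE_TRACKING_MIN_VALID_VERSION(OBJECT_ID('FMRental'))
      THEN 'History cleaned up - full export required'
      ELSE 'Incremental export possible'
   END AS export_mode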

Querying current tracked changes

After creating two new rentals in the FMRental table we can check out what is expected to be exported in the next run by issuing the following script:

DECLARE @last_sync_version bigint; 
SET @last_sync_version = (SELECT AXSYNCVERSION FROM DMFEntityDBSyncVersion
                 WHERE DEFINITIONGROUPNAME = 'Fleet_Rental_Export')
SELECT @last_sync_version AS LAST_SYNC_VERSION, SYS_CHANGE_VERSION, FMRental.* FROM FMRental
RIGHT JOIN CHANGETABLE(CHANGES FMRental, null) AS CT ON FMRental.RECID = CT.RECID

First, the @last_sync_version variable is initialized with the definition group's current synchronization version. This is then compared against the version numbers of the currently tracked changes.




As we can see, only the last two rows in the change list have a SYS_CHANGE_VERSION higher than LAST_SYNC_VERSION, so we expect two rows to be exported on the next run.









Again, the definition group is marked with the current highest change tracking number, so the same records are not exported next time.




Observations

  • Data entity Change Tracking starts at the moment of enablement, which means that records created or updated before this point are not tracked. That is why the initial export is always a full push. It is not possible to manually add records to the change tracking, unless you somehow change them through D365FO.

  • Deletion is not supported for plain data entity exports like files, which makes sense: how would you export a record that has been deleted to a file? Tracking deletions is supported for export to Bring Your Own Database (BYOD), as sketched below.
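To see what that looks like under the hood, deleted rows remain visible in the change-tracking side table with operation type “D”; a minimal sketch (the 0 passed as the last synchronization version simply means “give me all tracked changes”):

-- Deleted FMRental rows still appear in the change table,
-- which is what lets a BYOD export propagate deletions:
SELECT CT.RECID, CT.SYS_CHANGE_VERSION, CT.SYS_CHANGE_OPERATION
FROM CHANGETABLE(CHANGES FMRental, 0) AS CT
WHERE CT.SYS_CHANGE_OPERATION = 'D'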