Tuesday, 27 November 2012

Outlook data file lock & Lync

In brief: Running Lync with administrator rights locks the Outlook data file (the *.ost file that syncs with Exchange, not the *.pst).

In detail
On a rare occasion, I found that whenever I started up Outlook 2013, I hit the error message below and it prevented Outlook from starting up.

---------------------------
Outlook Data File
---------------------------
The file C:\Users\<UserId>\AppData\Local\Microsoft\Outlook\user.name@domain.com.ost is in use and cannot be accessed. Close any application that is using this file, and then try again. You might need to restart your computer.
---------------------------
OK  
---------------------------

It happened after I finished archiving email and closed Outlook to copy the archived PST file to a backup location. The last time I hit this error (on the *.pst file, not the *.ost file), it was because the Windows Search service was locking the *.pst file for indexing. Stopping that service resolves it; just start the service up again after Outlook has started.

But this time the same method didn't work. I then recalled that Lync stores message history in Outlook, so it could be the source of the lock. I closed Lync and started Outlook again; the error didn't occur. Resolved.

Out of curiosity, I wondered why Lync would lock the file that way. Then another thought came through: "Run as administrator". I was running Lync with administrator rights because I needed to do a screen sharing of another program that runs under administrator rights.

I did a few tests: running Lync normally won't lock the file, while running Lync as administrator will.

So if the machine has both Outlook and Lync, make sure:

  1. If Lync doesn't need administrator rights, the Outlook and Lync start-up sequence doesn't matter
  2. If Lync needs administrator rights, start up Outlook before Lync

Monday, 26 November 2012

AX2009 report showing square block for double-byte language (Eg. Chinese, Korean, or Japanese characters)

When double-byte characters (Eg. Chinese, Korean, or Japanese) are used on an AX2009 report and they show up as square blocks instead of the proper characters, the fix is to change the font under the user options.

1. Options > Font (tab)
2. Change the Font under the "Report" group to "Arial Unicode MS"






Friday, 16 November 2012

Misleading "The record already exists" message

The error message "The record already exists" is quite common in AX, especially during development.
Points 1 to 3 are quite common; the reason for this post is Point 4, which was new to me and which we hit just yesterday.

These are the few areas/directions where we usually start investigating:

  1. Unique index
    This is the most common reason: one of the unique index fields has a duplicated value.
    This can easily be checked by browsing the records with the table browser.
     
  2. Uncommitted record
    If the error happens during a process, it is possible that the record has already been inserted once but hasn't been committed yet; if you browse the data with the table browser in another AX session, you won't see it.

    One method is to select the data directly from SQL with the
    WITH(NOLOCK) table hint.
    Eg.
    SELECT * FROM TableA WITH(NOLOCK) WHERE FIELD1 = 'abcd'

    This has been quite useful when debugging large processes, Eg. inventory posting, movements, etc.
     
  3. Duplicated RecId
    This usually won't happen, but when developing with a team of developers, you might not know whether someone has done something directly in SQL (Eg. moving records between environments, DTC from one place to another).

    After the AOS starts up, the first time it inserts a record into a table it first retrieves the value from SystemSequences.NextVal, then updates that value by adding 250 to it. The AOS has thus cached 250 RecIds for this table and uses them for subsequent inserts. When the cached RecIds run out, it reads and updates the value again in the same way.

    This RecId duplication happens when SystemSequences.NextVal is lower than the current max(RecId) of the table the record is going to be inserted into (the target table). If there are gaps between the RecIds, the error might not occur immediately, but eventually it will.

    Eg.
    SystemSequences.NextVal = 5678900001
    The AOS caches the RecIds 5678900001 - 5678900250, then updates SystemSequences.NextVal to 5678900251
    The max(RecId) in the target table is 5678900747

    If the RecIds in the target table have no gaps, the error is hit immediately on insert (trying 5678900001, 5678900002, 5678900003, and so on from the 250 allocated/cached RecIds).
    If there are gaps, then depending on the RecIds allocated by the AOS and the RecIds already in the target table, the error will eventually be hit.

    But the error message AX shows when this occurs is misleading:
    AX tends to report that the record already exists, using the values from TitleField1 and TitleField2.

    To fix this, update SystemSequences.NextVal to the target table's max(RecId) + 1, then restart the AOS. Beware of multi-AOS environments: each AOS caches its own set of 250 RecIds, so you'll have to restart all AOS instances.

    *CAUTION: Only update the SystemSequences record where the field TABID = <Table Id>. Only that one record.
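    The check and the fix above can be sketched in T-SQL; <TargetTable> and <TableId> are placeholders for the actual table name and its table id, not real objects:

    ```sql
    -- Compare the cached sequence value against the actual max RecId
    -- (<TargetTable> and <TableId> are placeholders)
    SELECT NEXTVAL FROM SYSTEMSEQUENCES WHERE TABID = <TableId>;
    SELECT MAX(RECID) FROM <TargetTable>;

    -- If NEXTVAL is lower than MAX(RECID), move it past the highest RecId.
    -- Restart every AOS afterwards so the cached RecId blocks are discarded.
    UPDATE SYSTEMSEQUENCES
       SET NEXTVAL = (SELECT MAX(RECID) + 1 FROM <TargetTable>)
     WHERE TABID = <TableId>;
    ```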
     
  4. Default value in table definition has been altered
    Unfortunately I can't replicate the error message, but I'll try to explain what we hit yesterday.


    We got a "The record already exists" error message, so the first step was to check the above three reasons, but none of them turned out to be the cause.

    We then checked the Event log and saw an error complaining that a NULL value cannot be inserted into the CREATEDDATETIME field. So the error message shown in AX was different from the one in the Event log.
     

    Eg. 
    Object Server 02: The database reported (session 3 (PChan)): [Microsoft][SQL Native Client][SQL Server]Cannot insert the value NULL into column 'CREATEDDATETIME', table '<dbname>.dbo.<tablename>'; column does not allow nulls. INSERT fails.

    When the "CreatedDateTime" property of a table in AX is set to Yes, AX automatically creates the field in SQL with a default value definition. But somehow a mystery person tampered with the table (maybe by copying the table between environments directly through SQL) and forgot about this; the default value definition went missing, and that caused the error. In the screenshot below, on the left is the tampered table where the default definition is missing; on the right is the correct version.
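    One way to check from the SQL side whether the default definition is still in place is to query SQL Server's catalog views; this is a sketch, and <TableName> is a placeholder for the actual table name:

    ```sql
    -- List the default constraints defined on the table, to verify that
    -- the CREATEDDATETIME column still has its default definition
    -- (<TableName> is a placeholder)
    SELECT c.name  AS column_name,
           dc.name AS constraint_name,
           dc.definition
      FROM sys.default_constraints dc
      JOIN sys.columns c
        ON c.object_id = dc.parent_object_id
       AND c.column_id = dc.parent_column_id
     WHERE dc.parent_object_id = OBJECT_ID('<TableName>');
    ```

    If the CREATEDDATETIME column does not appear in the result, its default definition has been lost.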




Monday, 12 November 2012

Read unicode CSV file with thousand separator or comma in text value

There are a few classes available in Dynamics AX for reading text or CSV files.
When Unicode text and values containing commas are involved, choosing the correct class is important.

Below are some of the classes and their hierarchy.

Class name      Class declaration
==============  =================================
IO              class Io          extends Object
+- CommaIo      class CommaIo     extends Io
|  +- TextIo    class TextIo      extends CommaIo
|  +  Comma7Io  class Comma7Io    extends CommaIo
|  +- AsciiIo   class AsciiIo     extends CommaIo
+- CommaTextIo  class CommaTextIo extends Io

Given a CSV file with the following line:
,,미지정,외환은행,123-123456-123,외환체크사용,2012/10/05,KRW,,"2,350","11,563,531"

It contains both Unicode text (Eg. 외환은행) and values that contain a comma (Eg. 2,350).
The 2,350 needs to be read as 2350 instead of 2 and 350.
The screenshots below show how each class reads the line.

CommaTextIo correctly reads both the Unicode text and the amount with a thousand separator

CommaIo correctly reads the amount with a thousand separator, but not the Unicode text

TextIo correctly reads the Unicode text, but not the amount with a thousand separator (the amount is supposed to be 2350, not 2 and 350)

AsciiIo incorrectly reads both the Unicode text and the amount with a thousand separator

Below is the sample code used to read the CSV file; the screenshots above were captured by running the job below once for each of the classes mentioned (CommaTextIo, CommaIo, TextIo, & AsciiIo).

static void TestReadCSV(Args _args)
{
    #File
    #define.comma(',')
    CommaTextIo        commaTextIo;
    container          lineCon;
    ;

    // CommaTextIo handles Unicode text as well as quoted values
    // that contain the field delimiter (Eg. "2,350")
    commaTextIo = new CommaTextIo(@'C:\UnicodeSample.csv', #io_read);
    commaTextIo.inFieldDelimiter(#comma);
    commaTextIo.inRecordDelimiter(#delimiterCRLF);

    lineCon = commaTextIo.read();

    while (lineCon && (commaTextIo.status() == IO_Status::Ok))
    {
        // The 4th column of the sample line is the bank name
        info(conPeek(lineCon, 4));

        lineCon = commaTextIo.read();
    }
}

Monday, 5 November 2012

Invalid use of the DataAreaId field in a cross-company query

I did a search online and didn't find much info about this error - "Invalid use of the DataAreaId field in a cross-company query" - so it might be worth sharing what I found and the workaround.

During development today I hit this error. I'm not sure why the use of DataAreaId in the query criteria is not allowed in a crossCompany query, but there's a quick workaround - join to the DataArea table.



The query above fails to compile at 'tableA.dataAreaId' with the error message "Invalid use of the DataAreaId field in a cross-company query". To work around the error, join the query to the DataArea table.

Instead of explaining in plain text, it is easier to understand by looking at the code and results.
Below are the sample data and the queries to elaborate further.

TABLEA
DATAAREAID FIELD1     FIELD2     FIELD3     FIELD4
---------- ---------- ---------- ---------- ----------
ceu        CEU_R1_F1  CEU_R1_F2  CEU_R1_F3  CEU_R1_F4
ceu        CEU_R2_F1  CEU_R2_F2  CEU_R2_F3  CEU_R2_F4
con        CON_R1_F1  CON_R1_F2  CON_R1_F3  CON_R1_F4
con        CON_R2_F1  CON_R2_F2  CON_R2_F3  CON_R2_F4
dat        DAT_R1_F1  DAT_R1_F2  DAT_R1_F3  DAT_R1_F4
dat        DAT_R2_F1  DAT_R2_F2  DAT_R2_F3  DAT_R2_F4

TABLEB
DATAAREAID FIELD1     FIELD2     FIELD3     FIELD4
---------- ---------- ---------- ---------- ----------
ceu        F1         F2         F3         F4
con        F1         F2         F3         F4
dat        F1         F2         F3         F4


Workaround: joining to the DataArea table
while select crossCompany tableA
        join dataArea
       where dataArea.id       == tableA.dataAreaId
        join tableB
       where tableB.dataAreaId == dataArea.id
{
    info(strFmt("%1 - %2 - %3", tableA.dataAreaId,
                                tableA.Field1,
                                tableB.dataAreaId));
}

ceu - CEU_R1_F1 - ceu
ceu - CEU_R2_F1 - ceu
con - CON_R1_F1 - con
con - CON_R2_F1 - con
dat - DAT_R1_F1 - dat
dat - DAT_R2_F1 - dat


Incorrect join
Although TableB has one record per company, joining it with the crossCompany keyword performs the join without DataAreaId, producing an incorrect query result (a cross product of the two tables, as the output below shows).

while select crossCompany tableA
        join tableB
{
    info(strFmt("%1 - %2 - %3", tableA.dataAreaId,
                                tableA.Field1,
                                tableB.dataAreaId));
}

ceu - CEU_R1_F1 - ceu
ceu - CEU_R2_F1 - ceu
con - CON_R1_F1 - ceu
con - CON_R2_F1 - ceu
dat - DAT_R1_F1 - ceu
dat - DAT_R2_F1 - ceu
ceu - CEU_R1_F1 - con
ceu - CEU_R2_F1 - con
con - CON_R1_F1 - con
con - CON_R2_F1 - con
dat - DAT_R1_F1 - con
dat - DAT_R2_F1 - con
ceu - CEU_R1_F1 - dat
ceu - CEU_R2_F1 - dat
con - CON_R1_F1 - dat
con - CON_R2_F1 - dat
dat - DAT_R1_F1 - dat
dat - DAT_R2_F1 - dat