
When importing data into X Dispatch, it is important to know the structure of an X Import configuration file. There are two types of files, which can be distinguished by their file extension.

  • XINI files are on demand configurations.
  • XASN files are routed configurations.

Imports can use an XSLT transformation on an Excel file. Make sure that the XSLT file is in a folder location accessible to the import process, or add the XSLT file to X Dispatch as an attachment and reference that attachment in the import file (XINI/XASN).

The following is for both XINI and XASN file types unless otherwise noted. 

Datetime fields must use the format "yyyyMMdd HHmm".
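For example, 5:30 PM on January 2, 2026 would be written as 20260102 1730.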

Comments

Comments are indicated by the '#' character and continue to the end of the line. They can appear at the beginning of a line or at the end.

Example
# This is a comment
OriginName=0,CXT Software # This is also a comment

General Configuration Items

Configuration items are set using a Key=Value format. 

ConcurrentDataFiles

Optional

Sets the number of concurrent data files to be processed, which tells the system how many parallel processes to attempt to create. If this value is set to -1, the system will create as many as it can.

If this value is NOT set, it defaults to 1 to preserve existing functionality.

Example
ConcurrentDataFiles=-1
ConcurrentProcesses

Optional

Can be set in the X Stream definition to run multiple XASN/XINI files at the same time.
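
Assuming the value is numeric like ConcurrentDataFiles, a hypothetical example:

Example
ConcurrentProcesses=2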

DataMask

Required

Specifies the extension of the import files (e.g., xml, txt, *).

To disregard the file extension, you may use '*'; however, the DataMask item should be set whenever possible to prevent the processing of unintended files. Third-party companies sending import files often upload a file with a ".tmp" extension and rename it to the correct extension only once the upload has completed. If DataMask is set to '*', these temporary files may be processed before they have been completely transferred.

Example
DataMask=xml
DataPath

Required

Directory to monitor for incoming files. If the directory is not on the local machine, the fully qualified UNC path (\\server1\customer1\inbox) must be used.

Example
DataPath=c:\ximport\inbox
Inherit

Optional

Specifies whether files moved to the “ProcPath” directory should inherit the permissions of that directory when moved.  A value of “1” indicates files should inherit permissions, while a value of “0” indicates the file should retain its current permissions (default behavior). 

Example
Inherit=1
Delimiter

Required

This specifies the type of file to be processed and must be one of:

  • “XML”: The input file is an XML formatted document
  • “0”: The input file is a fixed-width file
  • “TAB”: The input file is tab delimited
  • “,”: The input file is in standard CSV format (See also: "Qualifier" item)
  • Any other character: The input file will be split on this character.
  • "Excel":  This functionality will require the installation of the 32 bit Microsoft Access Database Engine 2010 Redistributable

Example
Delimiter=XML
ErrPath

Required

The directory in which to store any error log files for the import.

Example
ErrPath=c:\ximport\errors
NumLinesSkipped

Optional

Indicates how many lines to skip from the top of an imported flat file. This allows you to bypass header rows in data files.

Example
NumLinesSkipped=1
PreProcessor

Optional

Full path to the preprocessor executable. 

Example
PreProcessor=c:\ximport\plugins\eTracPreProcessor.exe

If running the preprocessor from a batch file, you must include the full path to the specified preprocessor within the batch file rather than a relative path.

Example:

c:\ximport\plugins\eTracPreProcessor.exe - would run properly from a batch file

.\eTracPreProcessor.exe - would not run properly from a batch file

PreprocessorPassInstance

Optional in X Dispatch 19.2 or newer

Set to 1 to enable or 0 to disable passing the database instance to the pre-processor.

Defaults to 0 (the instance is not passed) so that existing preprocessors do not regress.
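
A hypothetical example enabling the option:

Example
PreprocessorPassInstance=1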

ProcPath

Required

The directory in which to store copies of the original import file and the log of XML requests used to place an order.

Example
ProcPath=c:\ximport\processed
Qualifier

Optional

Qualifier defaults to the double-quote character and should almost never be modified. The corner case where Qualifier may need to be specified is if the import file format is a hybrid CSV file where escaped data is surrounded with a character other than the double-quote. This case is exceedingly rare. 

Example
Qualifier="
URL

Required

Full URL to the XMLListener.asp file. 

Example
URL=http://127.0.0.1/XMLListener.asp
BaseNode

Required only if using XML for Routed   

The XML node used to indicate the base node.  

Example
BaseNode=/OrdersXASN
OrderNode

Required only if using XML 

The XML node used to indicate orders.  

If BaseNode is specified, this path is relative to BaseNode; otherwise, it is an absolute path from the root of the document. See Data Item Types (XML) below.

Example
OrderNode=/Orders/Order
ParcelNode

Required only if using XML 

The XML node used to indicate parcels. See Data Item Types (XML) below. 

ParcelNode is relative to OrderNode if used in XINI or if BaseNode is specified in routed.  Otherwise, it is an absolute path from the root of the document.

This node supports XPath statements.  These statements are a powerful method to extend the capability of returning data.

Example:

if you wanted the ParcelNode to find all subnodes that start with “D_87”, you could use an XPath statement such as this:

S_MAN[D_88_1="W"]/*[contains(local-name(),'D_87')]

This statement only looks at S_MAN nodes that have a D_88_1 node that is set to “W” and then returns all sub nodes that start with “D_87”.

StripBeginningQualifier

Optional

Used in XML only.

If the XML file submitted is wrapped in leading characters, X Import will not be able to process the document, because leading characters are not valid XML. To turn the file into valid XML, you can specify a mask of characters to strip from the very beginning of the file if they are found.

This could be more than one character and is not a dynamic field, so do not wrap the value in quotes or begin it with 0,. 

Example
StripBeginningQualifier="
StripEndingQualifier

Required only if using XML 

If the XML file submitted is wrapped in trailing characters, X Import will not be able to process the document, because trailing characters are not valid XML. To turn the file into valid XML, you can specify a mask of characters to strip from the very end of the file if they are found.

This could be more than one character and is not a dynamic field, so do not wrap the value in quotes or begin it with 0,. 

Example
StripEndingQualifier="
SkipLinesIfBeginsWith

Optional

This is only processed for fixed width and delimited imports. It will not be considered in an XML import. 

A regular expression is used to test each line to see if it should be processed.  If the regular expression matches the current line, it will be skipped and not be processed.  Here is a good reference for regular expressions: http://www.regular-expressions.info/reference.html 

Example
^[^"]

That regular expression matches any line that does not begin with a double quote (").  Therefore, any line beginning with something other than a double quote would not be processed by X Import.

LogRetentionDays

Required

Best Practice

Set this between 3 and 7 days, with a maximum of 7 days for Cloud.

The number of days to keep files that are in the ProcPath (.proc, .original) and ErrPath (.err) directories (EdiProcessedPath (.proc, .original) and EdiErrorsPath (.bad) for EDI).  

If this setting is omitted from the configuration file, or if the value cannot be parsed as an integer, the number of days to keep the files defaults to 90.

Example
LogRetentionDays=7

ReImport

Optional

Used to re-import a file in the processed directory that does not have a .proc file.

Example
ReImport=1  # Default is 0 ( no reimport ).
ReportEmail

Optional

The e-mail address to which X Import will send a report about the import process for the current input file. Multiple email addresses should be comma separated. This works for both On Demand and Routed.

Example
ReportEmail=text@cxtsoftware.com

ReportEmail is not stored with the Order, but is used to create an email to be sent. It can be found in tblMailLog, tblMailOutbox, or tblMailOutboxHistoric, depending on how much time has passed since the import and the status of the email.

MaxReimport

Required

The number of times to attempt to re-import failed imports. Omitting this item or setting it to 0 has no effect on import functionality. Any numeric value higher than 0 tells X Import to try the configured number of times to re-import the failed data.

This feature will only re-import failed orders/stops. Any successfully imported orders/stops will not be re-imported. 

If an import fails due to a bad parcel, X Stream will not create the stop or any parcels associated with the stop (even good parcels). A re-import file is generated (regardless of the MaxReimport setting) containing the stop information and only the "good" parcels to re-import (this helps avoid future duplicates if consolidation logic is not used). If the ReportEmail key is in the file, it will trigger an email with the failure reason.

Example
MaxReimport=3
LocalizedDateTimes

Optional

Set to 1 (enable) or 0 (disable). When enabled, this tells the Import process that the date/times in the ASN file are already localized to the stop location and should not be localized against the server date/time.

Defaults to 0 (disabled) so that it works as it always has.

This allows the user to let the import plugin know that the date/time stamp they are passing in (e.g. "07/27/2018 00:00:00") has already been localized.

If the user specifies a date/time with a timezone offset, the system will make use of that offset for the timezone (e.g. "07/27/2018T00:00:00-04:00").

Example
LocalizedDateTimes=1
Attempts

Optional

Specifies how many times each line is attempted on an import before moving on and throwing an error.

Defaults to 1 if not set. 

If not managed correctly, this setting can cause substantial processing delays. Before implementing it, make sure you fully understand how it interacts with MaxFailed and RetryDelay(ms) to prevent issues with integration processing times.
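
A hypothetical example (the value is illustrative):

Example
Attempts=3  # Try each line up to 3 times before throwing an error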

MaxFailed

Optional

Specifies how many lines in an import can fail to import before the remainder of the file is failed and set for reimport later.

If not set, the import process will continue and all of the failed entries will be put into a new file for reimport. 

If not managed correctly, this setting can cause substantial processing delays. Before implementing it, make sure you fully understand how it interacts with Attempts and RetryDelay(ms) to prevent issues with integration processing times.

Example

If a CSV file has 100 rows, it processes the first 25 successfully, the MaxFailed is set to 15, and the next 15 lines fail, the remaining 60 lines are failed as well. The new file for reimport attempt will have the remaining 75, including the 15 failed entries, and it will be attempted again based on the MaxReimport.
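
A hypothetical example matching the scenario above:

Example
MaxFailed=15  # Fail the remainder of the file after 15 failed lines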

RetryDelay(ms)

Optional

Specifies the amount of time, in milliseconds, to wait between failed attempts at importing a line from a file. This is the delay for attempting to process each data set: for delimited files it is the delay between attempts at each row, and for XML it is the delay between attempts at each node, based on the Attempts value.

Defaults to 0 if not set.

If not managed correctly, this setting can cause substantial processing delays. Before implementing it, make sure you fully understand how it interacts with Attempts and MaxFailed to prevent issues with integration processing times.
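
A hypothetical example, assuming the key is written exactly as shown above:

Example
RetryDelay(ms)=500  # Wait half a second between failed attempts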

XSLTTranslator

Optional

Contains the file name of the XSLT file to use for translation (e.g. C:\IMPORT\TEST.XSLT). The XSLT file can be used to modify the incoming data before it is processed by the Import plugin.
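
An example using the path mentioned above:

Example
XSLTTranslator=C:\IMPORT\TEST.XSLT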

XSLTDataPath

Optional

Contains the directory location to which the XSLT translation output data is written. This is primarily used for debugging purposes, allowing the user to see the data created by the XSLT transformation.
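
A hypothetical example (the path is illustrative):

Example
XSLTDataPath=c:\ximport\xsltoutput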

XmlDataPath

Optional 

Contains the directory location to which the output that the Import plugin sends to the back end is written. This is primarily used for debugging purposes, allowing the user to see what data is being sent to the back end server.
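
A hypothetical example (the path is illustrative):

Example
XmlDataPath=c:\ximport\xmloutput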

XmlPostAlso

Optional 

Contains a 1 or 0 to enable or disable sending requests to the back end.

This value is used in conjunction with XmlDataPath.

This must be set to 1 for the Import plugin to properly process requests and send them to the back end. It is set to 0 only for debugging the Import process without sending the request to the back end server.
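
A hypothetical example for normal operation:

Example
XmlPostAlso=1  # Set to 0 only when debugging with XmlDataPath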

QueueBatchConfigType

Optional

Set to the config type from tblQueue_Batch that denotes the import data to process.

How it is configured:

  1. CXT Software deploys the CXT: Save Web Service String Data plugin. This plugin creates a web based endpoint that writes whatever is sent to it to the tblQueue_Batch table. This plugin is configured to specify the config type, which will be used in the XASN file. 
  2. In the XINI/XASN, set the QueueBatchConfigType key to the same config type as configured in the previous step.

How it works:

  1. A web service calls the endpoint set up above and sends data as part of the body of the request. This data can be XML, JSON, TAB delimited or FIXED length - EXCEL format is NOT supported. The plugin will store the request in tblQueue_Batch using the specified config type.
  2. The import plugin is run and will scan tblQueue_Batch for any records with the given config type. If found, the data from those records will be written to a file in the folder specified by the DataPath parameter. Once all of the records have been written out, the import plugin will process the files as normal.
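
A hypothetical example; the value must match the config type configured in the plugin described above:

Example
QueueBatchConfigType=OrderBatch  # Hypothetical config type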
DelayImport

Optional

Used to delay the processing of a file by a set number of seconds. 

Example
DelayImport=180  # This is for 3 minutes  60*3 = 180
NameSpacePrefix

Optional

Required in some XML inbound documents or if NameSpaceUri is used. 

Example
NameSpacePrefix=ns2  # This data would be found at the root element in the XML doc
NameSpaceUri

Optional

Required in some XML inbound documents or if NameSpacePrefix is used. 

Example
NameSpaceUri=http://internal.amazon.com/TrailerInfo  # This data would be found at the root element in the XML doc
ReplaceContent

Optional 

Used to replace any content in the document that matches the specified value with an empty string.

Example
ReplaceContent=:TrailerInfo xmlns:"http://www.schema.com"  # Any match for this string will be replaced with the empty string ""

File Failures

The import plugin will move any failing data files to the ERRORS folder.

FailureEmailFromAddress

Optional

FailureEmailFromAddress is the return email address for failure emails. This parameter and FailureEmailFromName must match the settings in the email account. For example, if the email account has the email address email@cxtsoftware.com and the name John Doe, FailureEmailFromAddress must be "email@cxtsoftware.com" and FailureEmailFromName must be "John Doe".

The best practice is to use the same email address as the one set in the active Mail Manager profile. 

FailureEmailToAddress must be set.

Failure emails are sent if the import fails to pre-process the inbound data file or to translate it using XSLT. The email is sent with the data file attached along with the failure message.

FailureEmailFromName

Optional

FailureEmailFromName is the return email name for failure emails. This parameter and FailureEmailFromAddress must match the settings in the email account. For example, if the email account has the email address email@cxtsoftware.com and the name John Doe, FailureEmailFromAddress must be "email@cxtsoftware.com" and FailureEmailFromName must be "John Doe".

The best practice is to use the same email address as the one set in the active Mail Manager profile. 

FailureEmailToAddress must be set.

Failure emails are sent if the import fails to pre-process the inbound data file or to translate it using XSLT. The email is sent with the data file attached along with the failure message.

FailureEmailToAddress

Optional

FailureEmailToAddress is the valid email address to which failure emails should be sent.

Failure emails are sent if the import fails to pre-process the inbound data file or to translate it using XSLT. The email is sent with the data file attached along with the failure message.
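
A sketch combining the three items, reusing the values from the examples above (the To address is hypothetical):

Example
FailureEmailFromAddress=email@cxtsoftware.com
FailureEmailFromName=John Doe
FailureEmailToAddress=alerts@cxtsoftware.com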

Example
DataPath=C:\ximport\inbox # example comment
DataMask=XML
Inherit=0
ProcPath=C:\ximport\Processed
ErrPath=C:\ximport\Errors
# another comment
URL=http://127.0.0.1/SampleWeb/XMLListener.asp
Delimiter=XML

Data Item Types 

There are three types of data items: static, dynamic, and macro. 

Static Data Items

Static data items allow you to populate fields with known values, for example, when all of the deliveries sent in on an ASN will be assigned the same order type, origin address, etc. Static data items can be set by prepending the argument with "0," followed by the static value.

Example
UserID=0,ximport-mckesson

Dynamic Data Items

Dynamic data items allow you to set an item’s value based upon the type of input file. 

Macro Data Items

Macro data items allow you to populate fields with certain pre-defined macros. Currently, the only supported macros are the following:

NOW

Returns the current timestamp at the time of processing, e.g. "04/20/2008 11:11", also known as "Server Time". This is the time of the server on which X Import is running.

Time may be added and subtracted in minute increments by appending the appropriate math after the "NOW" keyword.

  • “NOW+30” in the example above would be “04/20/2008 11:41”
  • “NOW-30” in the example above would be “04/20/2008 10:41"
TODAY

Returns the current date at the time of processing with the time set to 12:00 AM, e.g. "04/20/2008 00:00", also known as "X Dispatch Time". This is the time in X Dispatch via dbo.GETCXTDate().

Time may be added and subtracted in day (whole or fractional) increments by appending the appropriate math after the "TODAY" keyword.

  • “TODAY+2” would evaluate to “04/22/2008 00:00”
  • “TODAY-2” would evaluate to “04/18/2008 00:00”
  • “TODAY+0.5” would evaluate to “04/20/2008 12:00"
CXTNOW

Returns the current date at the time of processing within cxtAsp.

This would be useful if the time zone of the system running X Import does not line up with the time zone of X Dispatch.  For example, if a customer on Cloud is set for Eastern time but the server running their instance is set to Pacific or Central time.  

Applies to the following XINI fields:

  • ReadyTimeFrom
  • ReadyTimeTo
  • PickupTime
  • RequestDeliverTime

Applies to the following XASN fields:

  • PostDate
  • StopTime
  • StopTimeMin
  • StopTimeMax

If CXTNOW is used in a field outside of what is outlined above, the field will contain the results of NOW instead. 

Example

PickupDate=NOW+360
DeliverDate=TODAY+2
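
A minimal sketch using CXTNOW in one of the XINI fields listed above (the field choice is illustrative):

Example
ReadyTimeFrom=CXTNOW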

Macro data items can also be manipulated to add time offsets. For example, if you receive an ASN for same day delivery and the file comes in at inconsistent times (11 PM one day and 1 AM the next), using the above macros may not be sufficient. For these cases, it is possible to add a time offset and then apply a date transformation.

If you want to accept a same day ASN before 3 PM, but roll everything after 3 PM to the next day, you could do the following:

PickupDate=NOW&09:01|DAYFLOOR

The syntax for this type of macro manipulation is in the following form: Item=Macro&[OffsetHours:]OffsetMinutes|Transformation

Where:

  • Item is a valid Data Item for the configuration file.
  • Macro must be one of: NOW (see NOTE below) or TODAY.
  • OffsetHours specifies the number of hours to add to the evaluated macro (required)
  • OffsetMinutes specifies the number of minutes to add to the evaluated macro (required)
  • Transformation is one of:
    • DAYFLOOR: Set the resulting time to 00:00
    • DAYUPPER: Set the resulting time to 00:00 and adds one day
    • WEEKDAYFLOOR: Behaves similarly to DAYFLOOR. If the time, after adding the OffsetHours and OffsetMinutes, is on a Saturday or Sunday, it will be pushed to the following Monday and then set the resulting time to 00:00 
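
A hypothetical variation of the 3 PM cutoff example above, using WEEKDAYFLOOR so that a result landing on a weekend rolls to the following Monday:

PickupDate=NOW&09:01|WEEKDAYFLOOR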

NOW is a macro that pulls the SERVER time, not the time local to the customer. The server time for Cloud customers is always UTC and the imported files will show the time local to the customer. In the example above, if the customer’s local time is EST you would need to add 4 hours to the syntax example: 

Example
PickupDate=NOW&13:01|DAYFLOOR
