File Format (Routed ASN) - *.XASN

The syntax for the routed ASN configuration is slightly different from the on demand configuration, but many similarities exist.

The primary difference is that routed ASN configuration files introduce the concept of using two nodes, one to represent the order information and the other for the parcel information (usually rsMaster and rsSecondary, set with the Nodes configuration item). In the following examples, the node names of rsMaster and rsSecondary are assumed:

Example
Nodes=rsMaster,rsSecondary

General

File Format (General)

Datetime fields must use the format “yyyyMMdd HHmm".
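For example, 5:30 PM on March 14, 2024 expressed in that format:

Example
20240314 1730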

Comments

Comments are indicated by the '#' character and continue to the end of the line. A comment can appear at the beginning of a line or at the end of a line.

Example
# This is a comment
OriginName=0,CXT Software # This is also a comment

General Configuration Items

Configuration items are set using a Key=Value format. 

Configuration Item | Description
ConcurrentDataFiles

Optional

This value sets the number of concurrent data files to be processed and tells the system how many parallel processes to attempt to create. If this value is set to -1, the system will create as many as it can.

If this value is NOT set, it defaults to 1, preserving the previous single-file behavior.

Example
ConcurrentDataFiles=-1
ConcurrentProcesses

Optional

Can be set in the X Stream definition to run multiple XASN/XINI files at the same time.
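A minimal sketch, using an illustrative value (any positive integer could be used):

Example
ConcurrentProcesses=2  # hypothetical value: run up to two XASN/XINI files at the same time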

DataMask

Required

Specifies the extension of the import files (e.g., xml, txt, *).

To disregard the file extension, you may use '*'; however, the DataMask item should be set whenever possible to prevent the processing of unintended files. Third-party companies sending the import files will often upload the file with a ".tmp" extension and rename it to the correct extension only once the upload has completed. If the DataMask is set to '*', these temporary files may be processed before they have been completely transferred.

Example
DataMask=xml
DataPath

Required

Directory to monitor for incoming files. If the directory is not on the local machine, the fully qualified UNC path (\\server1\customer1\inbox) must be used.

Example
DataPath=c:\ximport\inbox
Inherit

Optional

Specifies whether files moved to the “ProcPath” directory should inherit the permissions of that directory when moved.  A value of “1” indicates files should inherit permissions, while a value of “0” indicates the file should retain its current permissions (default behavior). 

Example
Inherit=1
Delimiter

Required

This specifies the type of file to be processed and must be one of:

  • “XML”: The input file is an XML formatted document
  • “0”: The input file is a fixed-width file
  • “TAB”: The input file is tab delimited
  • “,”: The input file is in standard CSV format (See also: "Qualifier" item)
  • Any other character: The input file will be split on this character.
  • "Excel":  This functionality will require the installation of the 32 bit Microsoft Access Database Engine 2010 Redistributable

Example
Delimiter=XML
ErrPath

Required

The directory in which to store any error log files for the import.

Example
ErrPath=c:\ximport\errors
NumLinesSkipped

Optional

Indicates how many lines to skip from the top of an imported flat file. This allows you to bypass header rows in data files.

Example
NumLinesSkipped=1
PreProcessor

Optional

Full path to the preprocessor executable. 

Example
PreProcessor=c:\ximport\plugins\eTracPreProcessor.exe

If running the preprocessor from a batch file, you must include the full path to the specified preprocessor within the batch file and not use a relative path.

Example:

c:\ximport\plugins\eTracPreProcessor.exe - would run properly from a batch file

.\eTracPreProcessor.exe - would not run properly from a batch file

PreprocessorPassInstance

Optional in X Dispatch 19.2 or newer

0/1 to disable/enable passing of the database instance to the pre-processor.

Defaults to 0 (the instance is not passed) so that existing preprocessors are not affected.
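A minimal sketch, assuming you want the database instance passed to the pre-processor:

Example
PreprocessorPassInstance=1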

ProcPath

Required

The directory in which to store copies of the original import file and the log of XML requests used to place an order.

Example
ProcPath=c:\ximport\processed
Qualifier

Optional

Qualifier defaults to the double-quote character and should almost never be modified. The corner case where Qualifier may need to be specified is if the import file format is a hybrid CSV file where escaped data is surrounded with a character other than the double-quote. This case is exceedingly rare. 

Example
Qualifier="
URL

Required

Full URL to the XMLListener.asp file. 

Example
URL=http://127.0.0.1/XMLListener.asp
BaseNode

Required only if using XML for Routed   

The XML node used to indicate the base node.  

Example
BaseNode=/OrdersXASN
OrderNode

Required only if using XML 

The XML node used to indicate orders.  

If BaseNode is specified, this path is relative to BaseNode; otherwise, it is an absolute path from the root of the document. See Data Items (XML) below.

Example
OrderNode=/Orders/Order
ParcelNode

Required only if using XML 

The XML node used to indicate parcels. See Data Item Types (XML) below. 

ParcelNode is relative to OrderNode if used in XINI or if BaseNode is specified in routed.  Otherwise, it is an absolute path from the root of the document.

This node supports XPath statements.  These statements are a powerful method to extend the capability of returning data.

Example:

If you wanted the ParcelNode to find all subnodes that start with "D_87", you could use an XPath statement such as this:

S_MAN[D_88_1="W"]/*[contains(local-name(),'D_87')]

This statement only looks at S_MAN nodes that have a D_88_1 node that is set to “W” and then returns all sub nodes that start with “D_87”.
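Putting BaseNode, OrderNode, and ParcelNode together, here is a minimal sketch for a hypothetical document laid out as <OrdersXASN><Orders><Order><Parcels><Parcel>; the node names and relative paths below are illustrative only, not taken from an actual integration:

Example
BaseNode=/OrdersXASN
OrderNode=Orders/Order      # assumed relative to BaseNode because BaseNode is specified
ParcelNode=Parcels/Parcel   # assumed relative to OrderNode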

StripBeginningQualifier

Optional

Used in XML only.

If the XML file submitted is wrapped in leading characters, X Import will not be able to process the document, because leading characters are not valid XML. To make the file valid XML, you can specify a mask of characters to strip from the very beginning of the file if they are found.

This can be more than one character and is not a dynamic field, so do not wrap the value in quotes or begin it with 0,.

Example
StripBeginningQualifier=”
StripEndingQualifier

Required only if using XML 

If the XML file submitted is wrapped in trailing characters, X Import will not be able to process the document, because trailing characters are not valid XML. To make the file valid XML, you can specify a mask of characters to strip from the very end of the file if they are found.

This can be more than one character and is not a dynamic field, so do not wrap the value in quotes or begin it with 0,.

Example
StripEndingQualifier=”
SkipLinesIfBeginsWith

Optional

This is only processed for fixed width and delimited imports. It will not be considered in an XML import. 

A regular expression is used to test each line to see if it should be processed. If the regular expression matches the current line, that line will be skipped and not processed. A good reference for regular expressions is http://www.regular-expressions.info/reference.html

Example
^[^"]

That regular expression matches any line that does not begin with a double quote (").  Therefore, any line beginning with something other than a double quote would not be processed by X Import.

LogRetentionDays

Required

Best Practice

Set between 3 and 7 days; for Cloud, the maximum is 7 days.

The number of days to keep files that are in the ProcPath (.proc, .original) and ErrPath (.err) directories (EdiProcessedPath (.proc, .original) and EdiErrorsPath (.bad) for EDI).  

If this setting is omitted from the configuration file, or if the value cannot be parsed as an integer, the number of days to keep the files defaults to 90.

Example
LogRetentionDays=7

ReImport

Optional

Used to re-import a file in the processed directory that does not have a .proc file.

Example
ReImport=1  # Default is 0 ( no reimport ).
ReportEmail

Optional

The e-mail address to which X Import will send a report about the import process for the current input file. Multiple email addresses should be comma separated. This works for both On Demand and Routed.

Example
ReportEmail=text@cxtsoftware.com

ReportEmail is not stored with the Order, but is used to create an email to be sent. It can be found in tblMailLog, tblMailOutbox, or tblMailOutboxHistoric, depending on the time that has passed since the import and the status of the email.

MaxReimport

Required

The number of times to attempt to re-import failed imports. Omitting this item or setting it to 0 leaves import functionality unchanged. Any numeric value higher than 0 tells X Import to try the configured number of times to re-import the failed data.

This feature will only re-import failed orders/stops. Any successfully imported orders/stops will not be re-imported. 


If an import fails due to a bad parcel, X Stream will not create the stop or any parcels associated with the stop (even good parcels). A re-import file is generated (regardless of the MaxReimport setting) containing the stop information and only the "good" parcels to re-import (this helps avoid future duplicates if consolidation logic is not used). If the key ReportEmail is in the file, it will trigger an email with the failing reason.

Example
MaxReimport=3
LocalizedDateTimes

Optional

0/1 (disable/enable). This tells the Import process that the date/times in the ASN file are already localized to the stop location and should not be localized against the server date/time.

Defaults to 0 (disabled) to preserve existing behavior.

This allows the user to let the import plugin know that the date/time stamp they are passing in (e.g. "07/27/2018 00:00:00") has already been localized.

If the user specifies a date/time with a timezone offset, the system will make use of that offset for the timezone (e.g. "07/27/2018T00:00:00-04:00").

Example
LocalizedDateTimes=1

Possible version compatibility issue. See Changelog below. 

Attempts

Optional

Specifies how many times each line is attempted on an import before moving on and throwing an error.

Defaults to 1 if not set. 

If not managed correctly, this could cause substantial processing delays. Before implementing, ensure that you fully understand how this item works with MaxFailed and RetryDelay(ms) to prevent issues with integration processing times.
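A minimal sketch, using an illustrative value:

Example
Attempts=3  # hypothetical: try each line up to three times before moving on and throwing an error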

Possible version compatibility issue. See Changelog below.

MaxFailed

Optional

Specifies how many lines in an import can fail to import before the remainder of the file is failed and set for reimport later.

If not set, the import process will continue and all of the failed entries will be put into a new file for reimport. 

If not managed correctly, this could cause substantial processing delays. Before implementing, ensure that you fully understand how this item works with Attempts and RetryDelay(ms) to prevent issues with integration processing times.

Example

If a CSV file has 100 rows, it processes the first 25 successfully, the MaxFailed is set to 15, and the next 15 lines fail, the remaining 60 lines are failed as well. The new file for reimport attempt will have the remaining 75, including the 15 failed entries, and it will be attempted again based on the MaxReimport.
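Expressed as a configuration line, that scenario (an illustrative value) would be:

Example
MaxFailed=15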

Possible version compatibility issue. See Changelog below.

RetryDelay(ms)

Optional

Specifies an amount of time in milliseconds to wait between failed attempts at importing a line from a file. This is the delay applied to each data set: for delimited files, the delay applies to each row; for XML, it applies to each node, based on the Attempts value.

Defaults to 0 if not set.

If not managed correctly, this could cause substantial processing delays. Before implementing, ensure that you fully understand how this item works with Attempts and MaxFailed to prevent issues with integration processing times.
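A minimal sketch, assuming the key is written exactly as the item name and using an illustrative delay:

Example
RetryDelay(ms)=1000  # hypothetical: wait one second between failed attempts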

Possible version compatibility issue. See Changelog below.

XSLTTranslator

Optional

Contains the file name of the XSLT file to use for translation (e.g. C:\IMPORT\TEST.XSLT). The XSLT file can be used to modify the incoming data before it is processed by the Import plugin.
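Using the path given in the description above:

Example
XSLTTranslator=C:\IMPORT\TEST.XSLT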

XSLTDataPath

Optional

Contains the directory location where to write the XSLT translation output data. This is primarily used for debugging purposes. It allows the user to see the data that was created by the XSLT transformation.
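A minimal sketch, with an illustrative directory:

Example
XSLTDataPath=c:\ximport\xsltout  # hypothetical debug output directory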

XmlDataPath

Optional 

Contains the directory location where to write the output that the Import plugin sent to the back end. This is primarily used for debugging purposes. It allows the user to see what data is being sent to the back end server.
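A minimal sketch, with an illustrative directory:

Example
XmlDataPath=c:\ximport\xmlout  # hypothetical debug output directory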

XmlPostAlso

Optional 

Contains a 1 or 0 to enable or disable sending requests to the back end.

This value is used in conjunction with XmlDataPath.

This must be set to 1 for the Import plugin to properly process requests and send them to the back end. It is set to 0 only for debugging the Import process without sending the request to the back end server.
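A minimal sketch for normal processing:

Example
XmlPostAlso=1  # set to 0 only when debugging without sending requests to the back end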

QueueBatchConfigType

Optional

Set to the config type from tblQueue_Batch that denotes the import data to process, as shown in the sketch after the steps below.

How it is configured:

  1. CXT Software deploys the CXT: Save Web Service String Data plugin. This plugin creates a web based endpoint that writes whatever is sent to it to the tblQueue_Batch table. This plugin is configured to specify the config type, which will be used in the XASN file. 
  2. In the XINI/XASN, set the QueueBatchConfigType key to the same config type as configured in the previous step.

How it works:

  1. A web service calls the endpoint set up above and sends data as part of the body of the request. This data can be XML, JSON, TAB delimited or FIXED length - EXCEL format is NOT supported. The plugin will store the request in tblQueue_Batch using the specified config type.
  2. The import plugin is run and will scan tblQueue_Batch for any records with the given config type. If found, the data from those records will be written to a file in the folder specified by the DataPath parameter. Once all of the records have been written out, the import plugin will process the files as normal.
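A minimal sketch, using a hypothetical config type value (the actual value must match what is configured in the CXT: Save Web Service String Data plugin):

Example
QueueBatchConfigType=MyWebServiceImport  # hypothetical config type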

Possible version compatibility issue. See Changelog below.

DelayImport

Optional

Used to delay the processing of a file by a set number of seconds. 

Example
DelayImport=180  # This is for 3 minutes  60*3 = 180
NameSpacePrefix

Optional

Required in some XML inbound documents or if NameSpaceUri is used. 

Example
NameSpacePrefix=ns2  # This data would be found at the root element in the XML doc
NameSpaceUri

Optional

Required in some XML inbound documents or if NameSpacePrefix is used. 

Example
NameSpaceUri=http://internal.amazon.com/TrailerInfo  # This data would be found at the root element in the XML doc
ReplaceContent

Optional 

Used to replace any content in the document that matches the specified value with an empty string.

Example
ReplaceContent=:TrailerInfo xmlns:"http://www.schema.com"  # Any match for this string will be replaced with the empty string ""

File Failures

The import plugin will move any failing data files to the ERRORS folder.

Possible version compatibility issue. See Changelog below.

Configuration Item | Description
FailureEmailFromAddress

Optional

FailureEmailFromAddress is a parameter for a return email address for failure emails. This parameter and FailureEmailFromName must be set the same as the settings in the email account. For example, if the email account has the email as email@cxtsoftware.com and the name as John Doe, the FailureEmailFromAddress must be "email@cxtsoftware.com" and the FailureEmailFromName must be "John Doe".

The best practice is to use the same email address as the one set in the active Mail Manager profile. 

FailureEmailToAddress must be set.

The failure emails will be sent if the import fails to pre-process the inbound data file or to translate it using XSLT. The email will be sent with the data file attached, along with the failure message.

FailureEmailFromName

Optional

FailureEmailFromName is a parameter for a return email name for the failure emails. This parameter and FailureEmailFromAddress must be set the same as the settings in the email account. For example, if the email account has the email as email@cxtsoftware.com and the name as John Doe, the FailureEmailFromAddress must be "email@cxtsoftware.com" and the FailureEmailFromName must be "John Doe".

The best practice is to use the same email address as the one set in the active Mail Manager profile. 

FailureEmailToAddress must be set.

The failure emails will be sent if the import fails to pre-process the inbound data file or to translate it using XSLT. The email will be sent with the data file attached, along with the failure message.

FailureEmailToAddress

Optional

FailureEmailToAddress is a parameter for a valid email address where the failure email should be sent. 

The failure emails will be sent if the import fails to pre-process the inbound data file or to translate it using XSLT. The email will be sent with the data file attached, along with the failure message.
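A minimal sketch combining the three items, using the address and name from the descriptions above and a hypothetical recipient address:

Example
FailureEmailFromAddress=email@cxtsoftware.com
FailureEmailFromName=John Doe
FailureEmailToAddress=support@example.com  # hypothetical recipient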

Example
DataPath=C:\ximport\inbox # example comment
DataMask=XML
Inherit=0
ProcPath=C:\ximport\Processed
ErrPath=C:\ximport\Errors
# another comment
URL=http://127.0.0.1/SampleWeb/XMLListener.asp
Delimiter=XML

Data Item Types 

There are three types of data items: static, dynamic, and macro. 

Static Data Items

Static data items allow you to populate fields with known values. For example, if all of the deliveries sent in on an ASN will be assigned the same order type, origin address, etc. Static data items can be set by prepending the argument with “0,” followed by the static value. 

Example
UserID=0,ximport-mckesson

Dynamic Data Items

Dynamic data items allow you to set an item’s value based upon the type of input file. 
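For instance, drawing on the examples later in this document, a delimited (CSV) import references a column number, a fixed-width import references a starting column and length, and an XML import references a node path; the field names below are illustrative:

Example
rsMaster|Reference1=1                                   # CSV: value taken from column 1
rsMaster|CustRouteID=230,6                              # fixed width: column 230, length 6
rsMaster|Address=Route/Stop/Customer$CustomerAddress    # XML: node path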

Macro Data Items

Macro data items allow you to populate fields with certain pre-defined macros. Currently, the only supported macros are the following:

NOW

Returns the current timestamp at the time of processing, e.g. "04/20/2008 11:11" (AKA "Server Time"). This is the time of the server on which X Import is running.

Time may be added or subtracted in minute increments by appending the appropriate math after the "NOW" keyword.

  • “NOW+30” in the example above would be “04/20/2008 11:41”
  • “NOW-30” in the example above would be “04/20/2008 10:41"
TODAY

Returns the current date at the time of processing with the time set to 12:00 AM, e.g. "04/20/2008 00:00" (AKA "X Dispatch Time"). This is the time in X Dispatch via dbo.GETCXTDate().

Time may be added or subtracted in day (whole or fractional) increments by appending the appropriate math after the "TODAY" keyword.

  • “TODAY+2” would evaluate to “04/22/2008 00:00”
  • “TODAY-2” would evaluate to “04/18/2008 00:00”
  • “TODAY+0.5” would evaluate to “04/20/2008 12:00"
CXTNOW

Returns the current date at the time of processing within cxtAsp.

This would be useful if the time zone of the system running X Import does not line up with the time zone of X Dispatch.  For example, if a customer on Cloud is set for Eastern time but the server running their instance is set to Pacific or Central time.  

Applies to the following XINI fields:

  • ReadyTimeFrom
  • ReadyTimeTo
  • PickupTime
  • RequestDeliverTime

Applies to the Following XASN Fields:  

  • PostDate
  • StopTime
  • StopTimeMin
  • StopTimeMax

If CXTNOW is used in a field outside of what is outlined above, the field will contain the results of NOW instead. 

Example

PickupDate=NOW+360
DeliverDate=TODAY+2

Macro data items can also be manipulated to add time offsets. For example, if you receive an ASN for same day delivery and the file comes in at inconsistent times (11 PM one day and 1 AM the next), using the above macros may not be sufficient. For these cases, it is possible to add a time offset and then apply a date transformation.

If you want to accept a same day ASN before 3 PM, but roll everything after 3 PM to the next day, you could do the following:

PickupDate=NOW&09:01|DAYFLOOR

The syntax for this type of macro manipulation is in the following form: Item=Macro&[OffsetHours:]OffsetMinutes|Transformation

Where:

  • Item is a valid Data Item for the configuration file.
  • Macro must be one of: NOW (see NOTE below) or TODAY.
  • OffsetHours specifies the number of hours to add to the evaluated macro (required)
  • OffsetMinutes specifies the number of minutes to add to the evaluated macro (required)
  • Transformation is one of:
    • DAYFLOOR: Set the resulting time to 00:00
    • DAYUPPER: Set the resulting time to 00:00 and adds one day
    • WEEKDAYFLOOR: Behaves similarly to DAYFLOOR. If the time, after adding the OffsetHours and OffsetMinutes, is on a Saturday or Sunday, it will be pushed to the following Monday and then set the resulting time to 00:00 

NOW is a macro that pulls the SERVER time, not the time local to the customer. The server time for Cloud customers is always UTC and the imported files will show the time local to the customer. In the example above, if the customer’s local time is EST you would need to add 4 hours to the syntax example: 

Example
PickupDate=NOW&13:01|DAYFLOOR

Data Items

Data items are set using a Node|Key=Value format.

The syntax of Value varies based on the Delimiter item (i.e., whether the input file is XML, delimited, or fixed-width). 

Data Item | Description
rsMaster|EncryptedPassword

Required

Specifies whether the X Dispatch Internet User's password has been encrypted. Most of the time, this should be set to true.

Example
rsMaster|EncryptedPassword=0,1
rsMaster|Password

Required

Password for the X Dispatch Internet User used for placing orders. 

Example
rsMaster|Password=0,testpassword
rsMaster|UserID

Required

X Dispatch Internet User to use for placing the orders. 

Example
rsMaster|UserID=0,ximport-mckesson
rsMaster|CustID

Required

Customer ID used to place the order. 

Example
rsMaster|CustID=0,1218

Route Stop Details

Data Item | Description
rsMaster|Address

Required

The address to use for the current route stop.

rsMaster|Address2

Required

The address line 2 to use for the current route stop.

rsMaster|BOL

Optional

The BOL (Bill of Lading) data to which route stops may be associated.  

Maximum length = 50. 

Example
rsMaster|BOL=0,ABC123geo
rsMaster|City

Required

The city to use for the current route stop.

rsMaster|CustRouteID

Optional

The primary reference for looking up route translations (See Configuration: X Import Translations.)

rsMaster|CustSecondaryLookup

Optional

The secondary reference for looking up route translations (See Configuration: X Import Translations.)

rsMaster|RouteSortRuleID

Optional

If a fixed RouteID is not set, and CustRouteID/CustSecondaryLookup is not utilized or fails to find a match, the database tries to find the lowest OrderTypeID that contains a destination sort rule and uses that sort rule to try and determine the route.

This functionality makes use of the “Auto assign pickup to route by rule” and “Auto assign delivery to route by rule” options on the Order Type edit form in X Dispatch.

RouteSortRuleID allows you to explicitly set the route sort rule used to set the route in that case.  This means you can use an existing route sort rule instead of setting up translations or can set a specific sort rule to use as a fallback should the route translations fail.
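A minimal sketch, using a hypothetical sort rule ID:

Example
rsMaster|RouteSortRuleID=0,12  # hypothetical route sort rule ID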

rsMaster|OverrideGLAccountID

Optional

On a dynamic route stop, it is possible to set a specific GL account to assign to that stop.  In order to override the GL account for an imported route stop, set this optional field to the ID of the GL Account you wish to use as the override.
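A minimal sketch, using a hypothetical GL account ID:

Example
rsMaster|OverrideGLAccountID=0,5  # hypothetical GL account ID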

rsMaster|RateChartOverrideLookup

Optional

This value sets the rate chart override lookup value, to be checked against values in the X Stream Route Mappings section. If the values match, the Translation field of the route mappings will be used as the stamp rate chart ID for rating all items on that route stop, rather than the customer’s default stamp rate chart ID.

rsMaster|RateChartSecondaryLookup

Optional

This value sets the rate chart secondary override lookup value, to be checked against values in the X Stream Route Mappings section's Reference2 field. If the values match, the Translation field of the route mappings will be used as the stamp rate chart ID for rating all items on that route stop, rather than the customer's default stamp rate chart ID. If specified, this must also match in order for the mapping to apply.

This does not work with a hardcoded value in the .XASN configuration.

rsMaster|GeoCode

Optional

Whether the address should be geocoded to retrieve latitude and longitude, as well as to validate the postal code up to 9 digits. This should generally be set to true.

Example
rsMaster|GeoCode=0,1
rsMaster|Name

Required

The name to use for the imported route stop.

rsMaster|NodeName

Required

The master node name should always be set to "ProcessRouteASN".

Example
rsMaster|NodeName=0,ProcessRouteASN
rsMaster|Pieces

Optional

The number of pieces on the order. If this is not specified or is less than the number of parcels sent in for the route stop, the piece count will be set by the number of parcels specified in the request.

rsMaster|PK

Required

The PK field sets the primary key for linking the rsMaster and rsSecondary data sets. rsMaster|PK should match rsSecondary|FK. The best fields to use for the primary key will generally be the original tracking or order number from the ASN.

rsMaster|Plus4

Optional

The 4-digit postal code extension for the stop.

rsMaster|PostDate

Required

The date on which this route stop should be posted.

rsMaster|Reference1

Optional

User-defined reference field 1.

rsMaster|Reference2

Optional

User-defined reference field 2.

rsMaster|RouteID

Optional

The route ID onto which the route stops should be placed (See Configuration: X Import Translations, and the CustRouteID and CustSecondaryLookup items above.)

rsMaster|Sequence

Optional

The sequence number for the route stops. Generally, it is better to leave this item out of your configuration file and let the stops be auto sequenced on import. If you wish for stops to be sequenced using the value specified here, you must disable auto sequencing.

rsMaster|State

Required

The state to use for the imported route stop.

rsMaster|StopNotes

Optional

Stop-specific notes (e.g., “Dock 12”)

rsMaster|StopTime

Optional

The time at which this stop should be scheduled. 

This field can be set to "0,", which sets it to a blank value and uses the stop time preferred mappings or the current system time (at time of import).
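For example, a fixed stop time (as in the full example at the end of this document):

Example
rsMaster|StopTime=0,12:00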

rsMaster|StopTimeFallback

Optional

The time at which this stop should be scheduled if StopTime is set to "0,", which sets it to a blank value.

If this field is not set, then the system will use the current system time (at time of import).

rsMaster|StopType

Optional

The stop type. Defaults to 1 (Delivery) if not specified or if an invalid stop type (anything other than 0-6) is specified.

rsMaster|Weight

Optional

The net weight of all parcels on this route stop. If this field is not specified, the cumulative weight of all parcels sent in will be used instead.

rsMaster|Length

Length used in dimensional weight calculator.

Data: ParcelLength stored in tblOrderRouteStop. It is displayed on the dimensional weight calculator on the route stop form as Length.

Possible version compatibility issue. See Changelog below.

rsMaster|Width

Width used in dimensional weight calculator.

Data: ParcelWidth stored in tblOrderRouteStop. It is displayed on the dimensional weight calculator on the route stop form as Width.

Possible version compatibility issue. See Changelog below.

rsMaster|Height

Height used in dimensional weight calculator.

Data: ParcelHeight stored in tblOrderRouteStop. It is displayed on the dimensional weight calculator on the route stop form as Height.

Possible version compatibility issue. See Changelog below.

rsMaster|DimWtDenominator

Dimensional weight denominator.

Data: DimWtDenominator is stored in tblOrderRouteStop as DimWtDenominator. It is displayed on the dimensional weight calculator on the route stop form as Denominator.

Possible version compatibility issue. See Changelog below.

rsMaster|UseCalculatedDimensionalWeight

A flag that, when the Length, Width, and Height values are set, tells the import to calculate the Dimensional Weight.

Possible version compatibility issue. See Changelog below.

rsMaster|Zip

Required

The postal code of the route stop.

rsMaster|StopTimeMin

Optional

Min stop time.

rsMaster|StopTimeMax

Optional

Max stop time.

rsMaster|ShortRemarks

Optional

Short remarks.

rsMaster|LongRemarks

Optional

Long remarks.

rsMaster|EmailPod

Optional

Email pod.

rsMaster|XMobileSecondaryLookup

Optional

Secondary lookup for X Mobile.

rsMaster|ForceLocationScan

Optional

Force Location Barcode (Mobile Secondary Lookup) checkbox, for Nextstop workflow. 

Example
rsMaster|ForceLocationScan=0,true

When configured as true, this feature sets the driver requirement to perform a location scan for the route stop.

rsMaster|CustomerRouteID

Optional

Customer route ID.

rsMaster|CustomerStopID

Optional

Customer stop ID.

rsMaster|CustomerAccountID

Optional

Customer account ID.

rsMaster|RouteOverride

Optional

Route pay override. 

Example
rsMaster|RouteOverride=0,5.00

The following will add an Override Settlement of $0.00 if LD is present in the S502 element but if not, will not function at all. Replace the 0 with a dollar amount if you want to include an Else scenario: rsMaster|RouteOverride=IIF(S502;"LD";"0.00";0)

rsMaster|AutoSequence

Optional

Flag that determines if the route stops should be auto sequenced. True value is '1', False value is '0'. Defaults to '1' if the value is omitted from the file.

rsMaster|AdvancedFallbackLookup1

Optional

Data: AdvancedFallbackLookup1 is stored in tblOrderRouteStops in the AdvancedFallbackLookup1 column. It is the ALF1 field under the miscellaneous tab in X Dispatch.

rsMaster|AdvancedFallbackLookup2

Optional

Data: AdvancedFallbackLookup2 is stored in tblOrderRouteStops in the AdvancedFallbackLookup2 column. It is the ALF2 field under the miscellaneous tab in X Dispatch.

rsMaster|RateChartOverrideLookup

Optional

Data: RateChartOverrideLookup is stored in tblOrderRouteStops in the RateChartOverrideLookup column. It is the Rate Chart Lookup field under the Miscellaneous tab in X Dispatch.

rsMaster|ConsolidateStops

Optional

Flag that determines if stops consolidate. Set to '1' to turn on consolidation. Intended to consolidate stops from separate ASN files that would have consolidated if they were contained in the same ASN file. RouteID, PostDate, CustID, Address, City, State, and Zip are used to identify the stop to consolidate with. This searches any route for consolidation keys newer than 30 days from the current date. The default behavior allows stops to be created and NOT consolidated.

Please note that this feature only works if the RouteID is being passed in from the ASN directly.  If Route Mappings and/or Route Sort Rules are utilized, the Consolidation will fail. 

Example
rsMaster|ConsolidateStops=0,1
rsMaster|ConsolidationText

Optional

Intended to consolidate stops between ASN files. Use any string to set consolidation value - can be at any level. 

Example
rsMaster|ConsolidationText=230,6;236,6 # fixed width file: column 230, length=6 and column 236, length=6
rsMaster|ConsolidationText=1 # e.g., 1 = column in a csv file
rsMaster|ConsolidationText=APPEND(18,NOW&08:00|DAYFLOOR);APPEND(5," | ");APPEND(6,230,12) # date and values (12 characters starting at position 230) separated by a pipe

ConsolidationText creates a record in the database in cxtData.dbo.tblOrderRouteStops_Consolidated. ConsolidationText should include the date, since the consolidation lookup will try to match anything within the last 30 days.

rsMaster|ODSyncField[XINI field name]

Optional

The following 'ODSyncField' fields are available to match for synchronization with on demand orders. Each field is optional. If a match is found, the stop is not imported. The field name is 'rsMaster|ODSyncField' prepended to the XINI field name found in the on demand section of this document.

Example
rsMaster|ODSyncFieldCustID
rsMaster|ODSyncFieldBillingGroup
rsMaster|ODSyncFieldBillOfLading
rsMaster|ODSyncFieldDeliverDate
rsMaster|ODSyncFieldDestAddress
rsMaster|ODSyncFieldDestCity
rsMaster|ODSyncFieldDestComments
rsMaster|ODSyncFieldDestContactPhone
rsMaster|ODSyncFieldDestName
rsMaster|ODSyncFieldDestPlus4
rsMaster|ODSyncFieldDestState
rsMaster|ODSyncFieldDestSuite
rsMaster|ODSyncFieldDestZip
rsMaster|ODSyncFieldEmailOrderConfirmationChecked
rsMaster|ODSyncFieldEmailPOD
rsMaster|ODSyncFieldEmailPODChecked
rsMaster|ODSyncFieldEmailPOPChecked
rsMaster|ODSyncFieldMasterBillOfLading
rsMaster|ODSyncFieldOrderType
rsMaster|ODSyncFieldOriginAddress
rsMaster|ODSyncFieldOriginCity
rsMaster|ODSyncFieldOriginComments
rsMaster|ODSyncFieldOriginName
rsMaster|ODSyncFieldOriginPlus4
rsMaster|ODSyncFieldOriginState
rsMaster|ODSyncFieldOriginSuite
rsMaster|ODSyncFieldOriginZip
rsMaster|ODSyncFieldOverrideDeliveryTimeFrom
rsMaster|ODSyncFieldOverrideDeliveryTimeTo
rsMaster|ODSyncFieldOverrideReadyTimeFrom
rsMaster|ODSyncFieldOverrideReadyTimeTo
rsMaster|ODSyncFieldParcelPieces
rsMaster|ODSyncFieldParcelWeight
rsMaster|ODSyncFieldPickupDate
rsMaster|ODSyncFieldReference1
rsMaster|ODSyncFieldReference2
rsMaster|ODSyncFieldReportEmail
rsMaster|ODSyncFieldServiceType
rsMaster|ODSyncFieldSpecialInst
rsMaster|ODSyncFieldUserField0
rsMaster|ODSyncFieldUserField1
rsMaster|ODSyncFieldUserField2
rsMaster|ODSyncFieldUserField3
rsMaster|ODSyncFieldUserField4
rsMaster|ODSyncFieldUserField5
rsMaster|ODSyncFieldUserField6
rsMaster|ODSyncFieldUserField7
rsMaster|ODSyncFieldUserField8
rsMaster|ODSyncFieldUserField9
rsMaster|ODSyncFieldUserField10
rsMaster|ODSyncFieldUserField11
rsMaster|ODSyncFieldUserField12
rsMaster|ValidationFailOver

Optional

When used, returns at minimum the specified information when the address does not validate as an exact match.

Valid types are:

  1. LatLonMin - return the new lat/long coordinates only. (keeps old address.) 

    The geocode bit needs to be turned on for the LatLonMin ValidationFailOver process to return lat/lon coordinates.

    rsMaster|GeoCode=0,1

  2. AddressMin - return the new address only. (keeps old lat/long data.)
Example
rsMaster|ValidationFailOver=0,LatLonMin
rsMaster|MiscData|[...]

Optional

This is a dynamic field, which will store any amount of data as properties in a JSON object in the database. The [...] above can be replaced with any string and used as many times as needed. Data can be queried using a function that engineering has created. 

MiscData fields do not use the BaseNode, OrderNode, or ParcelNode when importing data from an XML ASN. The full XML path will need to be provided for each MiscData field.

Examples

Using the following in a XASN will store data as a property in a JSON object. 

rsMaster|MiscData|testfield1=0,sampledata

Execute

SELECT dbo.ParseJSONField(MiscData, 'testfield1') AS testfield1 FROM tblAdditionalData

Returns

“sampledata”


Using the following in a XASN will store data as a property in a JSON object. 

rsMaster|MiscData|IntegrationData1=0,SampleA
rsMaster|MiscData|IntegrationData2=0,SampleB
rsMaster|MiscData|IntegrationData3=0,SampleC

Execute

SELECT
dbo.ParseJSONField(MiscData, 'IntegrationData1') AS [Column1],
dbo.ParseJSONField(MiscData, 'IntegrationData2') AS [Column2],
dbo.ParseJSONField(MiscData, 'IntegrationData3') AS [Column3]
FROM tblAdditionalData


Accessorials

Data Item | Description
rsMaster|Accessorial1ID

Optional 

Used to include accessorial items imported from an ASN via XASN. This will be checked against the assigned rate chart and the rate will populate on the route stop. Maps to MiscItem1 on tblOrderRouteStops. 

If you specify an ItemID here, you must specify a quantity for the item in rsMaster|Accessorial1Quantity. 

Example
rsMaster|Accessorial1ID=8

Possible version compatibility issue. See Changelog below.

rsMaster|Accessorial1Quantity

Optional

Used in conjunction with rsMaster|Accessorial1ID to specify the quantity for the items imported from an ASN via XASN. This will be checked against the assigned rate chart and the rate will populate on the route stop. Maps to MiscQty1 on tblOrderRouteStops.

If you specify a quantity here, you must specify an ItemID for the item in rsMaster|Accessorial1ID

Example
rsMaster|Accessorial1Quantity=1

Possible version compatibility issue. See Changelog below.

rsMaster|Accessorial2ID

Optional

Used to include accessorial items imported from an ASN via XASN. This will be checked against the assigned rate chart and the rate will populate on the route stop. Maps to MiscItem2 on tblOrderRouteStops. 

If you specify an ItemID here, you must specify a quantity for the item in rsMaster|Accessorial2Quantity. 

Example
rsMaster|Accessorial2ID=0,10

Possible version compatibility issue. See Changelog below.

rsMaster|Accessorial2Quantity

Optional

Used in conjunction with rsMaster|Accessorial2ID to specify the quantity for the items imported from an ASN via XASN. This will be checked against the assigned rate chart and the rate will populate on the route stop. Maps to MiscQty2 on tblOrderRouteStops.

If you specify a quantity here, you must specify an ItemID for the item in rsMaster|Accessorial2ID

Example
rsMaster|Accessorial2Quantity=0,4

Possible version compatibility issue. See Changelog below.

User Field

Data Item | Description
rsMaster|UserField1

Optional

User data field 1.

rsMaster|UserField2

Optional

User data field 2.

rsMaster|UserField3

Optional

User data field 3.

rsMaster|UserField4

Optional

User data field 4.

rsMaster|UserField5

Optional

User data field 5.

rsMaster|UserField6

Optional

User data field 6.

Secondary

Data Item | Description
rsSecondary|FK

Required

The FK field sets the foreign key for linking the rsMaster and rsSecondary data sets. rsMaster|PK should match rsSecondary|FK. The best fields to use for the primary key will generally be the original tracking or order number from the ASN. 

Quick Tip

This value might exist above the location of the parcel node.  If this is the case, you can use the “../” notation to pull data from a node that exists above the parcel node.
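A minimal sketch of the "../" notation, assuming a hypothetical layout where the linking value lives one level above the parcel node; the path below is illustrative only:

Example
rsSecondary|FK=../Stop$StopNumber  # hypothetical: reads StopNumber from a node above the parcel node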

rsSecondary|NodeName

Required

The secondary node name should always be set to “Parcel”. 

Example
rsSecondary|NodeName=0,Parcel
rsSecondary|ParcelPieces

Optional

The number of pieces or items contained within this parcel.

rsSecondary|ParcelReference

Optional

Parcel-level barcode.

rsSecondary|ParcelType

Optional

The carrier-defined parcel type to use for the imported parcel. The ParcelType text can be the numeric ParcelTypeID from tblParcelTypes or the Description from tblParcelTypes - Description performs a wildcard search and uses the first ParcelTypeID found where the Description contains the ParcelType text.
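A minimal sketch, matching by description text (illustrative value):

Example
rsSecondary|ParcelType=0,Tote  # hypothetical: matches the first parcel type whose Description contains "Tote"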

rsSecondary|ParcelWeight

Optional

The weight associated with this parcel.

rsSecondary|PK

Required

The PK field on the secondary node is used when aggregating parcels onto route stops. Generally, this item should be set to a unique parcel-level barcode, or a combination of route, stop, or any other uniquely identifying information.

rsSecondary|ParcelDescription

Optional

Description of the parcel.

rsSecondary|ParcelReference2

Optional

Secondary reference field for the parcel.

rsSecondary|ParcelMasterBarcode

Optional

Secondary reference field for the parcel. This sets the Parcel Consolidation Barcode.

rsSecondary|ParcelConsolidationBarcode

Optional

This sets the Parcel Consolidation Barcode 2.

rsSecondary|MiscData|[...]

Optional

This is a dynamic field, which will store any amount of data as properties in a JSON object in the database. The [...] can be replaced with any string and used as many times as needed. Data can be queried using a function that engineering has created. 

MiscData fields do not use the BaseNode, OrderNode, or ParcelNode when importing data from an XML ASN. The full XML path will need to be provided for each MiscData field.

Examples

Using the following in a XASN will store data as a property in a JSON object. 

rsSecondary|MiscData|testfield1=0,sampledata

Execute

SELECT dbo.ParseJSONField(MiscData, 'testfield1') AS testfield1 FROM tblAdditionalData

Returns

“sampledata”


Using the following in a XASN will store data as a property in a JSON object. 

rsSecondary|MiscData|IntegrationData1=0,SampleA
rsSecondary|MiscData|IntegrationData2=0,SampleB
rsSecondary|MiscData|IntegrationData3=14

Execute

SELECT
dbo.ParseJSONField(MiscData, 'IntegrationData1') AS [Column1],
dbo.ParseJSONField(MiscData, 'IntegrationData2') AS [Column2],
dbo.ParseJSONField(MiscData, 'IntegrationData3') AS [Column3]
FROM tblAdditionalData

Example

DataPath=E:\ximport\CardinalOD
DataMask=XML
ProcPath=e:\ximport\Processed
ErrPath=e:\ximport\Error
URL=http://127.0.0.1/XMLListener.asp
Delimiter=XML
NumLinesSkipped=0
 
Nodes=rsMaster,rsSecondary
 
rsMaster|UserID=0,test
rsMaster|Password=0,TESTTEST
rsMaster|EncryptedPassword=0,1
rsMaster|CustID=0,1218
rsMaster|CustRouteID=230,6
rsMaster|CustSecondaryLookup=0,x
rsMaster|ContractID=0,69290
rsMaster|NodeName=0,ProcessRouteASN
rsMaster|PK=Route$RouteNumber,Route/Stop$StopNumber,Route/Stop/Customer$CustomerNumber
rsMaster|BaseNode=/CardinalDistributionShipmentFile/Shipment
rsMaster|Address=Route/Stop/Customer$CustomerAddress
rsMaster|City=Route/Stop/Customer$CustomerCity
rsMaster|GeoCode=0,1
rsMaster|Name=Route/Stop/Customer$CustomerName1
rsMaster|PostDate=Route/Stop/Customer$InvoiceDate
rsMaster|Reference1=Route/Stop/Customer$CustomerNumber
rsMaster|State=Route/Stop/Customer$CustomerState
rsMaster|StopNotes=Route/Stop/Customer$CustomerName2
rsMaster|StopTime=0,12:00
rsMaster|Zip=Route/Stop/Customer$CustomerZip
 
rsSecondary|NodeName=0,Parcel
rsSecondary|PK=Route/Stop/Customer/ToteCase$ToteCaseID
rsSecondary|FK=Route$RouteNumber,Route/Stop$StopNumber,Route/Stop/Customer$CustomerNumber
rsSecondary|ParcelReference=Route/Stop/Customer/ToteCase$ToteCaseID
rsSecondary|ParcelPieces=1

Changelog


X Dispatch 21.1

  • rsMaster|Length
  • rsMaster|Width
  • rsMaster|Height
  • rsMaster|DimWtDenominator
  • rsMaster|UseCalculatedDimensionalWeight

X Dispatch 21.0

  • QueueBatchConfigType
File Failures - The import plugin will move any failing data files to the ERRORS folder. Previously, any data files that failed to preprocess (via preprocessor or XSLT) would have been left in the inbound folder, which caused the import plugin to retry that data file over and over, throwing errors while never processing it.
    • FailureEmailFromAddress
    • FailureEmailFromName
    • FailureEmailToAddress

X Dispatch 20.0

  • rsMaster|Accessorial1ID
  • rsMaster|Accessorial1Quantity
  • rsMaster|Accessorial2ID
  • rsMaster|Accessorial2Quantity

X Dispatch 19.2

  • Attempts 
  • MaxFailed
  • RetryDelay(ms)

X Dispatch 18.1

  • LocalizedDateTimes 

X Dispatch 14.2.4

  • MaxReimport 

X Dispatch 14.2.1

  • rsMaster|MiscData|[...]
  • rsSecondary|MiscData|[...]

X Dispatch 13.1.2

  • rsSecondary|ParcelConsolidationBarcode

X Dispatch 12.1.4

  • SkipLinesIfBeginsWith available in XINI

X Dispatch 12.1.3

  • StripBeginningQualifier 
  • StripEndingQualifier

X Dispatch 12.0.0

  • SkipLinesIfBeginsWith is available in XASN

X Dispatch 11.2.6

  • Added use of modifiers for CXTNOW like with NOW

Version 1.4.5

  • Resolved an issue that prevented CXTNOW from being passed to the system for these fields.

Version 1.4.3

  • Delimiter - The CSV file can have an unlimited number of columns. Prior versions were limited to 201 columns.

Version 1.3.0

  • Delimiter value can be set to “Excel” to tell the import process to import Excel formatted files.  This process supports Excel 2003 and Excel 2007 or newer files (*.XLS, *.XLSX, *.XLSB and *.XLSM).

Version 1.2.0

  • ConcurrentDataFiles