Note |
---|
Datetime fields must use the format "yyyyMMdd HHmm". |
Comments begin with the '#' character and continue to the end of the line; they can appear at the beginning of a line or at the end. Code Block |
---|
| # This is a comment
OriginName=0,CXT Software # This is also a comment |
Configuration items are set using a Key=Value format. Code Block |
---|
| DataPath=C:\ximport\inbox # example comment
DataMask=XML
Inherit=0
ProcPath=C:\ximport\Processed
ErrPath=C:\ximport\Errors
# another comment
URL=http://127.0.0.1/SampleWeb/XMLListener.asp
Delimiter=XML |
Configuration Item | Description |
---|
ConcurrentDataFiles | Optional Sets the number of concurrent data files to process; the system will attempt to create that many parallel processes. Setting this value to -1 tells the system to create as many as it can. Note |
---|
If this value is NOT set, it defaults to 1 to mimic current functionality. |
Code Block |
---|
ConcurrentDataFiles=-1 |
| ConcurrentProcesses | Optional Can be set in the X Stream definition to run multiple XASN/XINI files at the same time. | DataMask | Required Specifies the extension of the import files (e.g., xml, txt, *). To disregard the file extension you may use '*'; however, the DataMask item should be set whenever possible to prevent the processing of unintended files. Third-party companies sending import files often upload a file with a ".tmp" extension and rename it to the correct extension only once the upload has completed. If DataMask is set to '*', these temporary files may be processed before they have been completely transferred. | DataPath | Required Directory to monitor for incoming files. If the directory is not on the local machine, the fully qualified UNC path (\\server1\customer1\inbox) must be used. Code Block |
---|
DataPath=c:\ximport\inbox |
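As a sketch, DataMask is typically paired with DataPath so that only completed files of the expected type are picked up (the path and extension here are illustrative):

```ini
# Monitor the inbox for completed XML files only
DataPath=c:\ximport\inbox
DataMask=xml   # avoid '*' so partially uploaded .tmp files are ignored
```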
| Inherit | Optional Specifies whether files moved to the "ProcPath" directory should inherit the permissions of that directory when moved. A value of "1" indicates files should inherit permissions, while a value of "0" indicates the file should retain its current permissions (default behavior). | Delimiter | Required Specifies the type of file to be processed and must be one of:
"XML": the input file is an XML-formatted document
"0": the input file is a fixed-width file
"TAB": the input file is tab delimited
",": the input file is in standard CSV format (see also the "Qualifier" item)
Any other character: the input file will be split on this character
"Excel": requires installation of the 32-bit Microsoft Access Database Engine 2010 Redistributable.
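A few illustrative Delimiter settings (a configuration file uses exactly one):

```ini
Delimiter=XML    # XML document
Delimiter=TAB    # tab-delimited flat file
Delimiter=,      # standard CSV (see also the Qualifier item)
Delimiter=0      # fixed-width file
```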
| ErrPath | Required The directory in which to store any error log files for the import. Code Block |
---|
ErrPath=c:\ximport\errors |
| NumLinesSkipped | Optional Indicates how many lines to skip from the top of an imported flat file. This allows you to bypass header rows in data files. Code Block |
---|
NumLinesSkipped=1 |
| PreProcessor | Optional Full path to the preprocessor executable. Code Block |
---|
PreProcessor=c:\ximport\plugins\eTracPreProcessor.exe |
Note |
---|
If running the preprocessor from a batch file, you must include the full path to the specified preprocessor within the batch file rather than a relative path. Example: c:\ximport\plugins\eTracPreProcessor.exe would run properly from a batch file; .\eTracPreProcessor.exe would not. |
| PreprocessorPassInstance | 0/1 to disable/enable passing the database instance to the pre-processor. Defaults to 0 (the parameter is not passed) so that existing preprocessors are unaffected. | ProcPath | Required The directory in which to store copies of the original import file and the log of XML requests used to place an order. Code Block |
---|
ProcPath=c:\ximport\processed |
| Qualifier | Optional Qualifier defaults to the double-quote character and should almost never be modified. The corner case where Qualifier may need to be specified is when the import file is a hybrid CSV in which escaped data is surrounded by a character other than the double quote. This case is exceedingly rare. | URL | Required Full URL to the XMLListener.asp file. Code Block |
---|
URL=http://127.0.0.1/XMLListener.asp |
| BaseNode | Required only if using XML for Routed The XML node used as the base node. Code Block |
---|
BaseNode=/OrdersXASN |
| OrderNode | Required only if using XML The XML node used to indicate orders. If listed, relative to BaseNode. Otherwise, it is an absolute path from the root of the document. See DATA ITEMS (XML) below. Code Block |
---|
OrderNode=/Orders/Order |
| ParcelNode | Required only if using XML The XML node used to indicate parcels. See Data Item Types (XML) below. ParcelNode is relative to OrderNode if used in XINI or if BaseNode is specified in routed. Otherwise, it is an absolute path from the root of the document. Info |
---|
This node supports XPath statements. These statements are a powerful method to extend the capability of returning data. Example: if you wanted the ParcelNode to find all subnodes that start with “D_87”, you could use an XPath statement such as this: S_MAN[D_88_1="W"]/*[contains(local-name(),'D_87')] This statement only looks at S_MAN nodes that have a D_88_1 node that is set to “W” and then returns all sub nodes that start with “D_87”. |
| StripBeginningQualifier | Optional If the submitted XML file is wrapped in leading characters, X Import cannot process the document, because leading characters are not valid XML. To make the file valid XML, you can specify a mask of characters to strip from the very beginning of the file if they are found. The mask can be more than one character and is not a dynamic field, so do not wrap the value in quotes or begin it with 0,. Code Block |
---|
StripBeginningQualifier=" |
| StripEndingQualifier | Required only if using XML If the submitted XML file is wrapped in trailing characters, X Import cannot process the document, because trailing characters are not valid XML. To make the file valid XML, you can specify a mask of characters to strip from the very end of the file if they are found. The mask can be more than one character and is not a dynamic field, so do not wrap the value in quotes or begin it with 0,. Code Block |
---|
StripEndingQualifier=" |
| SkipLinesIfBeginsWith | Optional Info |
---|
This is only processed for fixed width and delimited imports. It will not be considered in an XML import. |
A regular expression is used to test each line to see if it should be processed. If the regular expression matches the current line, that line is skipped and not processed. A good reference for regular expressions: http://www.regular-expressions.info/reference.html For example, a regular expression that matches any line not beginning with a double quote (") would cause X Import to skip every line that begins with something other than a double quote. | LogRetentionDays | Required Tip |
---|
Best Practice: Set between 3 and 7 days with the max set to 7 days for Cloud. |
The number of days to keep files that are in the ProcPath (.proc, .original) and ErrPath (.err) directories (EdiProcessedPath (.proc, .original) and EdiErrorsPath (.bad) for EDI). If this setting is omitted from the configuration file, or if its value cannot be parsed as an integer, the number of days to keep the files defaults to 90. Code Block |
---|
LogRetentionDays=7 |
| ReImport | Optional Used to re-import a file in the processed directory that does not have a .proc file. Code Block |
---|
ReImport=1 # Default is 0 (no re-import). |
| ReportEmail | Optional The e-mail address to which X Import will send a report about the import process for the current input file. Multiple email addresses should be comma separated. This works for both On Demand and Routed. Code Block |
---|
ReportEmail=text@cxtsoftware.com |
ReportEmail is not stored with the Order, but is used to create an email to be sent. It can be found in tblMailLog, tblMailOutbox, or tblMailOutboxHistoric, depending on how much time has passed since the import and the status of the email. | MaxReimport | Required The number of times to attempt to re-import failed imports. Omitting this item or setting it to 0 leaves import functionality unchanged. Any numeric value higher than 0 tells X Import to try the configured number of times to re-import the failed data.
This feature will only re-import failed orders/stops. Any successfully imported orders/stops will not be re-imported. Info |
---|
If an import fails due to a bad parcel, X Stream will not create the stop or any parcels associated with the stop (even good parcels). A re-import file is generated (regardless of the MaxReimport setting) containing the stop information and only the "good" parcels to re-import (this helps avoid future duplicates if consolidation logic is not used). If the key ReportEmail is in the file, it will trigger an email with the failing reason. |
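A short sketch (the retry count here is illustrative):

```ini
MaxReimport=3   # retry failed orders/stops up to 3 times; 0 disables re-import
```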
| LocalizedDateTimes | Optional 0/1 (disable/enable). Tells the import process that the date/times in the ASN file are already localized to the stop location and should not be localized against the server date/time. Defaults to 0 (disabled) so that it works as it always has. If LocalizedDateTimes is set to 0, the pickup/delivery times in the file are not localized and will be localized to the server (system local) time during the import process. If it is set to 1, the times provided are already localized and no further localization is performed. Info |
---|
This allows the user to let the import plugin know that the date/time stamp they are passing in (e.g. "07/27/2018 00:00:00") has already been localized. |
Code Block |
---|
LocalizedDateTimes=1 |
| Attempts | Optional Specifies how many times each line is attempted during an import before moving on and raising an error. Defaults to 1 if not set. Note |
---|
If not managed correctly, this setting can cause substantial processing delays. Before implementing it, ensure that you fully understand how it works together with MaxFailed and RetryDelay(ms) to prevent issues with integration processing times. |
| MaxFailed | Optional Specifies how many lines in an import can fail before the remainder of the file is failed and set for re-import later. If not set, the import process will continue and all failed entries will be put into a new file for re-import. Note |
---|
If not managed correctly, this setting can cause substantial processing delays. Before implementing it, ensure that you fully understand how it works together with Attempts and RetryDelay(ms) to prevent issues with integration processing times. |
Example: if a CSV file has 100 rows, the first 25 process successfully, MaxFailed is set to 15, and the next 15 lines fail, then the remaining 60 lines are failed as well. The new re-import file will contain those 75 lines (the 15 failed entries plus the remaining 60), and they will be attempted again based on MaxReimport. | RetryDelay(ms) | Optional Specifies an amount of time in milliseconds to wait between failed attempts at importing a line from a file. This is the delay for attempting to process each data set: for delimited files, the delay for processing each row; for XML, the delay for processing each node, based on the Attempts value. Defaults to 0 if not set. Note |
---|
If not managed correctly, this setting can cause substantial processing delays. Before implementing it, ensure that you fully understand how it works together with Attempts and MaxFailed to prevent issues with integration processing times. |
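A hedged sketch of the three retry-related items working together (the values are illustrative only):

```ini
Attempts=3          # try each line up to 3 times before raising an error
RetryDelay(ms)=500  # wait 500 ms between failed attempts on a line
MaxFailed=15        # after 15 failed lines, fail the rest of the file for re-import
```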
| XSLTTranslator | Optional Contains the file name of the XSLT file to use for translation (e.g., C:\IMPORT\TEST.XSLT). The XSLT file can be used to modify the incoming data before it is processed by the Import plugin. | XSLTDataPath | Optional The directory in which to write the XSLT translation output. This is primarily used for debugging; it allows the user to see the data created by the XSLT transformation. | XmlDataPath | Optional The directory in which to write the output that the Import plugin sent to the back end. This is primarily used for debugging; it allows the user to see what data is being sent to the back-end server. | XmlPostAlso | Optional Contains a 1 or 0 to enable or disable sending requests to the back end. This value is used in conjunction with XmlDataPath. Note |
---|
This must be set to 1 for the Import plugin to properly process requests and send them to the back end. It is set to 0 only for debugging the Import process without sending the request to the back end server. |
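An illustrative debugging setup combining these items (the paths are examples):

```ini
XSLTTranslator=c:\ximport\translate.xslt   # transform incoming data before import
XSLTDataPath=c:\ximport\debug\xslt         # write the transformed output here
XmlDataPath=c:\ximport\debug\xml           # write the requests sent to the back end
XmlPostAlso=1                              # 1 = actually send requests; 0 = debug only
```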
| QueueBatchConfigType | Optional Set to the config type from tblQueue_Batch that denotes import data to process. How it is configured: CXT Software deploys the CXT: Save Web Service String Data plugin. This plugin creates a web-based endpoint that writes whatever is sent to it to the tblQueue_Batch table. The plugin is configured with a config type, which will be used in the XASN file. In the XINI/XASN, set the QueueBatchConfigType key to the same config type as configured in the previous step.
How it works: a web service calls the endpoint set up above and sends data as part of the body of the request. This data can be XML, JSON, TAB delimited, or FIXED length; EXCEL format is NOT supported. The plugin stores the request in tblQueue_Batch using the specified config type. When the import plugin runs, it scans tblQueue_Batch for any records with the given config type. If found, the data from those records is written to a file in the folder specified by the DataPath parameter. Once all of the records have been written out, the import plugin processes the files as normal.
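A minimal sketch (the config type name is illustrative and must match the value configured in the plugin):

```ini
QueueBatchConfigType=WEBORDERS   # tblQueue_Batch records with this config type are picked up
DataPath=c:\ximport\inbox        # matched records are written here as files before processing
```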
| DelayImport | Optional Used to delay the processing of a file by a set number of seconds. Code Block |
---|
DelayImport=180 # 3 minutes (60*3 = 180) |
| NameSpacePrefix | Optional Required in some XML inbound documents or if NameSpaceUri is used. Code Block |
---|
NameSpacePrefix=ns2 # This data would be found at the root element in the XML doc |
| NameSpaceUri | Optional Required in some XML inbound documents or if NameSpacePrefix is used. Code Block |
---|
NameSpaceUri=http://internal.amazon.com/TrailerInfo # This data would be found at the root element in the XML doc |
| ReplaceContent | Optional Used to replace any content in the document that matches the given value with an empty string. Code Block |
---|
ReplaceContent=:TrailerInfo xmlns:"http://www.schema.com" # Any match for this string will be replaced with the empty string "" |
The import plugin will move any failing data files to the ERRORS folder. Info |
---|
All file-failure configuration items are optional. However, if implementing the file-failure notification feature, the following items must be included for emails to be sent as expected: FailureEmailToAddress, FailureEmailFromAddress, FailureEmailFromName
|
Configuration Item | Description |
---|
FailureEmailFromAddress | Optional The return email address for failure emails. This parameter and FailureEmailFromName must match the settings in the email account. For example, if the email account has the email email@cxtsoftware.com and the name John Doe, FailureEmailFromAddress must be "email@cxtsoftware.com" and FailureEmailFromName must be "John Doe". The best practice is to use the same email address as the one set in the active Mail Manager profile. Failure emails are sent if the import fails to pre-process or XSLT-translate the inbound data file; the email is sent with the data file attached along with the failure message. | FailureEmailFromName | Optional The return email name for failure emails. This parameter and FailureEmailFromAddress must match the settings in the email account, as described above. | FailureEmailToAddress | Optional A valid email address to which the failure email should be sent. Failure emails are sent if the import fails to pre-process or XSLT-translate the inbound data file; the email is sent with the data file attached along with the failure message. | FailureEmailOnNoData | Optional Sets whether or not to send an email when there is no data in the data file. Info |
---|
FailureEmailToAddress, FailureEmailFromAddress, and FailureEmailFromName must be set. This is only applicable for EXCEL imports. |
|
Code Block |
---|
| FailureEmailToAddress=email@cxtsoftware.com
FailureEmailFromAddress=monitoring@cxtsoftware.com
FailureEmailFromName=Import Failed
FailureEmailOnNoData=1 |
There are three types of data items: static, dynamic, and macro. Static data items allow you to populate fields with known values, for example when all of the deliveries sent in on an ASN are assigned the same order type, origin address, etc. Static data items are set by prepending the argument with "0," followed by the static value. Code Block |
---|
UserID=0,ximport-mckesson |
Dynamic data items allow you to set an item's value based upon the type of input file. Macro data items allow you to populate fields with certain pre-defined macros. Currently, the only supported macros are the following: NOW | Returns the current timestamp at the time of processing, e.g. "04/20/2008 11:11" (AKA "Server Time", the time of the server on which X Import is running). Time may be added or subtracted in minute increments by appending the appropriate math after the "NOW" keyword. | TODAY | Returns the current date at the time of processing with the time set to 12:00 AM, e.g. "04/20/2008 00:00" (AKA "Operations App Time", the time in the Operations App via dbo.GETCXTDate()). Time may be added or subtracted in day (whole or fractional) increments by appending the appropriate math after the "TODAY" keyword: "TODAY+2" would evaluate to "04/22/2008 00:00"; "TODAY-2" would evaluate to "04/18/2008 00:00"; "TODAY+0.5" would evaluate to "04/20/2008 12:00".
| CXTNOW | Returns the current date at the time of processing within cxtAsp. This is useful if the time zone of the system running X Import does not line up with the time zone of the Operations App, for example if a customer on Cloud is set for Eastern time but the server running their instance is set to Pacific or Central time. Applies to the following XINI fields: ReadyTimeFrom, ReadyTimeTo, PickupTime, RequestDeliverTime
Applies to the following XASN fields: PostDate, StopTime, StopTimeMin, StopTimeMax
If CXTNOW is used in a field outside of what is outlined above, the field will contain the results of NOW instead. |
Code Block |
---|
PickupDate= NOW+360
DeliverDate=TODAY+2 |
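As a hedged sketch of CXTNOW in one of its supported XASN fields (the field choice is illustrative):

```ini
StopTimeMin=CXTNOW   # Operations App time; in unsupported fields this evaluates as NOW
```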
Macro data items can also be manipulated to add time offsets. For example, if you receive an ASN for same day delivery and the file comes in at inconsistent times (11 PM one day and 1 AM the next), using the above macros may not be sufficient. For these cases, it is possible to add a time offset and then apply a date transformation. If you want to accept a same day ASN before 3 PM, but roll everything after 3 PM to the next day, you could do the following: Code Block |
---|
PickupDate=NOW&09:01|DAYFLOOR |
The syntax for this type of macro manipulation is of the following form: Item=Macro&[OffsetHours:]OffsetMinutes|Transformation
Where:
Item is a valid Data Item for the configuration file.
Macro must be one of: NOW (see NOTE below) or TODAY.
OffsetHours specifies the number of hours to add to the evaluated macro (required).
OffsetMinutes specifies the number of minutes to add to the evaluated macro (required).
Transformation is one of:
DAYFLOOR: sets the resulting time to 00:00
DAYUPPER: sets the resulting time to 00:00 and adds one day
WEEKDAYFLOOR: behaves like DAYFLOOR, but if the time, after adding OffsetHours and OffsetMinutes, falls on a Saturday or Sunday, it is pushed to the following Monday and then set to 00:00
NOW is a macro that pulls the SERVER time, not the time local to the customer. The server time for Cloud customers is always UTC, while the imported files show the time local to the customer. In the example above, if the customer's local time is EST, you would need to add 4 hours to the syntax example: Code Block |
---|
PickupDate=NOW&13:01|DAYFLOOR |
|