Amazon S3 transfer sites

The Amazon S3 transfer site is intended for file transfers to and from the Amazon Simple Storage Service (S3). The Amazon S3 transfer site can be used with the Basic and Advanced Routing applications for push and pull server-initiated transfers.

The following is the Add Transfer Site page for an Amazon S3 transfer site definition.

The following sections describe the Amazon S3 transfer site configuration options, provide basic flow descriptions for Amazon S3 transfer sites, outline the limited expression language supported by Amazon S3 transfer sites, and list the known Amazon S3 transfer site limitations:

Configuration options

Configuring an Amazon S3 transfer site consists of making selections and completing fields for the following:

Site settings

Site Settings pane for an Amazon S3 transfer site:

The following table describes the site settings options for an Amazon S3 transfer site.

Field Description
Site Settings
Bucket

The custom property that represents the Amazon S3 bucket name used as the base for upload and download operations.

Example:

com.amazon.axway.entity

Autocreate bucket on upload if it doesn't exist

(Optional) Select to automatically create the specified Bucket if it does not exist. Only used for push (upload) operations. If this option is not selected and the Bucket does not exist, the transfer will fail.
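
The autocreate behavior can be pictured at the SDK level as a check-then-create step before the first push. The following is only a hedged sketch using the AWS SDK for Java (v1) with the example bucket name from above; it is not SecureTransport's internal code.

import com.amazonaws.services.s3.AmazonS3;
import com.amazonaws.services.s3.AmazonS3ClientBuilder;

public class AutocreateBucketSketch {
    public static void main(String[] args) {
        // Illustrative only: a client built with default credentials and region.
        AmazonS3 s3 = AmazonS3ClientBuilder.defaultClient();
        String bucket = "com.amazon.axway.entity";   // example bucket name from above

        // "Autocreate bucket on upload if it doesn't exist":
        // check for the bucket and create it before the first upload.
        if (!s3.doesBucketExistV2(bucket)) {
            s3.createBucket(bucket);
        }
        // Without this step, an upload to a missing bucket fails.
    }
}
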
Region (Optional)

Predefined set of Amazon S3 regions. The region must match the region where the configured Bucket has been created. For example, if the configured Bucket is created in us-east-1 and the selected region is eu-west-1, the transfers will fail. The available regions are:

  • US Standard (N. Virginia) - us-east-1
  • US West (N. California) - us-west-1
  • US West (Oregon) - us-west-2
  • EU (Frankfurt) - eu-central-1
  • EU (Ireland) - eu-west-1
  • Asia Pacific (Mumbai) - ap-south-1
  • Asia Pacific (Seoul) - ap-northeast-2
  • Asia Pacific (Singapore) - ap-southeast-1
  • Asia Pacific (Sydney) - ap-southeast-2
  • Asia Pacific (Tokyo) - ap-northeast-1
  • South America (São Paulo) - sa-east-1
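
Because transfers fail when the selected region does not match the region where the Bucket was created, it can help to confirm where a bucket actually lives before configuring the site. The following is a minimal sketch using the AWS SDK for Java (v1); the bucket name is the example value from above, and this is not part of the transfer site itself.

import com.amazonaws.services.s3.AmazonS3;
import com.amazonaws.services.s3.AmazonS3ClientBuilder;

public class BucketRegionCheck {
    public static void main(String[] args) {
        AmazonS3 s3 = AmazonS3ClientBuilder.standard()
                .withRegion("us-east-1")          // region used only for the lookup call
                .build();

        // Returns the bucket's location constraint, for example "eu-west-1".
        // Note: buckets in US Standard (us-east-1) may be reported as "US".
        String location = s3.getBucketLocation("com.amazon.axway.entity");
        System.out.println("Bucket lives in: " + location);

        // The Region selected on the transfer site must match this value,
        // otherwise transfers through the site fail.
    }
}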

Network Zone

The network zone that defines the proxies to use for transfers through this site.

  • Select none to connect directly to the remote Amazon S3 server.
  • Select any to allow SecureTransport to select the proxy connection using a network zone that enables an HTTP proxy.
  • Select Default to use the default network zone proxy configuration. If no default network zone is defined, transfers from this transfer site fail.
  • Select a specific network zone to use the proxy configuration defined for that zone.
Note The Amazon S3 transfer site supports only an HTTP proxy. If the selected Network Zone does not have an HTTP proxy configured, the network zone is ignored.

For more information, see Specify TM Server communication ports and IP address for protocol servers on SecureTransport Edge.
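
As a rough illustration of what an HTTP-proxied S3 connection looks like at the SDK level, the sketch below configures an AWS SDK for Java (v1) client with a hypothetical proxy host and port. It stands in for the proxy a network zone would supply and is not the network zone implementation itself.

import com.amazonaws.ClientConfiguration;
import com.amazonaws.services.s3.AmazonS3;
import com.amazonaws.services.s3.AmazonS3ClientBuilder;

public class ProxiedS3ClientSketch {
    public static void main(String[] args) {
        // Hypothetical HTTP proxy, standing in for the proxy defined by a network zone.
        ClientConfiguration config = new ClientConfiguration();
        config.setProxyHost("proxy.example.com");
        config.setProxyPort(8080);

        AmazonS3 s3 = AmazonS3ClientBuilder.standard()
                .withRegion("eu-west-1")
                .withClientConfiguration(config)
                .build();

        System.out.println("S3 client configured to use an HTTP proxy");
    }
}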

Site credentials

Site Credentials pane for an Amazon S3 transfer site:

The following table describes the site credentials options for an Amazon S3 transfer site.

Field Description
Site Credentials
Access Key ID

The Amazon S3 access key/login ID obtained from the Amazon Identity and Access Management (IAM) console.

Example:

AKIAINGYPMH2LE5MGZDQ

Secret Access Key

The secret login key obtained from the Amazon IAM console.
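
A hedged sketch of how the Access Key ID and Secret Access Key pair translates into an authenticated client, using the AWS SDK for Java (v1). The key values below are placeholders, not real credentials, and the code is illustrative rather than SecureTransport's own.

import com.amazonaws.auth.AWSStaticCredentialsProvider;
import com.amazonaws.auth.BasicAWSCredentials;
import com.amazonaws.services.s3.AmazonS3;
import com.amazonaws.services.s3.AmazonS3ClientBuilder;

public class SiteCredentialsSketch {
    public static void main(String[] args) {
        // Placeholder keys in the same shape as the IAM console values.
        BasicAWSCredentials credentials =
                new BasicAWSCredentials("AKIAEXAMPLEKEYID", "exampleSecretAccessKey");

        AmazonS3 s3 = AmazonS3ClientBuilder.standard()
                .withRegion("us-east-1")
                .withCredentials(new AWSStaticCredentialsProvider(credentials))
                .build();

        // Any operation against the Bucket now authenticates with this key pair.
        s3.listBuckets().forEach(b -> System.out.println(b.getName()));
    }
}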

Transfer settings

The following topics describe the download and upload settings.

Download settings

Transfer Settings - Download settings pane for an Amazon S3 transfer site:

The following table describes download transfer settings options for an Amazon S3 transfer site.

Field Description
Transfer Settings - Download settings

Download object key/folder

(Optional)

The site configuration that represents the target Amazon S3 object key to download (pull). If the specified object key corresponds to an S3 folder, the contents of the folder will be downloaded recursively. Any empty folders will be skipped.

If this field is left empty, no objects are pulled. For the root bucket folder, specify only a forward slash (/).

The download object key/folder can be evaluated using the SecureTransport expression language.

When the following expression is used, the download folder is evaluated using the current date when the transfer site is executed:

folder_${date("dd.MM.yyyy")}/
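
At execution time the expression above resolves to a value such as folder_30.01.2015/. The following is a minimal sketch of the equivalent date formatting in plain Java; it only illustrates what the expression evaluates to and is not the SecureTransport expression engine.

import java.text.SimpleDateFormat;
import java.util.Date;

public class DownloadFolderExpressionSketch {
    public static void main(String[] args) {
        // ${date("dd.MM.yyyy")} formats the current date with the given pattern.
        String today = new SimpleDateFormat("dd.MM.yyyy").format(new Date());

        // folder_${date("dd.MM.yyyy")}/ therefore resolves to e.g. "folder_30.01.2015/".
        String downloadFolder = "folder_" + today + "/";
        System.out.println(downloadFolder);
    }
}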

Pattern settings
Pattern Type

Select either Regular Expression or File Globbing.

Download Pattern

The pattern matching expression supports both File Globbing and Regular Expression syntaxes.

When File Globbing is selected, the String representation of the filename is matched using a limited pattern language that resembles regular expressions but with a simpler syntax. For example:

  • *.xml - Matches files ending in .xml
  • foo.?? - Matches file names starting with foo. that have a double character extension.
  • *.[0-9] - Matches file names ending in .1, .2, .3, .4, .5, .6, .7, .8, .9, .0
  • *.[^0-9] - Matches file names having a single character extension different from 1, 2, 3, 4, 5, 6, 7, 8, 9, 0
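
For a feel of how such globs behave, the sketch below matches file names with java.nio.file glob matchers, whose syntax is close to (though not guaranteed identical to) the limited pattern language described here; note that java.nio globs write negation as [!0-9] where the pattern language above uses [^0-9].

import java.nio.file.FileSystems;
import java.nio.file.PathMatcher;
import java.nio.file.Paths;

public class GlobPatternSketch {
    public static void main(String[] args) {
        String[] names = { "report.xml", "foo.gz", "data.7", "backup.z" };
        // [!0-9] is the java.nio spelling of the [^0-9] negated class used above.
        String[] globs = { "*.xml", "foo.??", "*.[0-9]", "*.[!0-9]" };

        for (String glob : globs) {
            PathMatcher matcher = FileSystems.getDefault().getPathMatcher("glob:" + glob);
            for (String name : names) {
                if (matcher.matches(Paths.get(name))) {
                    System.out.println(glob + " matches " + name);
                }
            }
        }
    }
}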

When Regular Expression is selected, the String representation of the filename is matched against Perl5.003 regular expressions. Extended Perl5 regular expressions are also supported. For example:

  • .*\.(xml|txt) - Matches files ending in .xml or .txt
  • (?i)data\.xml - Case-insensitive match of the data.xml file
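
The documented engine is Perl5.003-compatible; the sketch below uses java.util.regex instead, which handles these particular examples the same way, and is purely illustrative.

import java.util.regex.Pattern;

public class RegexPatternSketch {
    public static void main(String[] args) {
        String[] names = { "report.xml", "notes.txt", "DATA.XML", "image.png" };

        // Matches files ending in .xml or .txt
        Pattern xmlOrTxt = Pattern.compile(".*\\.(xml|txt)");
        // Case-insensitive match of the data.xml file
        Pattern dataXml = Pattern.compile("(?i)data\\.xml");

        for (String name : names) {
            if (xmlOrTxt.matcher(name).matches()) {
                System.out.println(".*\\.(xml|txt) matches " + name);
            }
            if (dataXml.matcher(name).matches()) {
                System.out.println("(?i)data\\.xml matches " + name);
            }
        }
    }
}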

The download pattern is evaluated using the expression language.

  • With File Globbing selected, *_${date("dd.MM.yyyy")}.txt is evaluated using the current date when the transfer site is executed, for example *_20150130.txt. This matches all files ending with _20150130.txt.
  • With Regular Expression selected, [a-z]*_${date("dd.MM.yyyy")}.txt is evaluated using the current date when the transfer site is executed, for example [a-z]*_20150130.txt. This matches all files starting with any combination of letters from a to z and ending with _20150130.txt.
Post actions
Receive File As

Specify a value to receive the file with a different name. A limited expression language can be used to specify a file name.

Available environment variables:

  • ${ts.target}

Examples:

  • file_${random()}
  • ${ts.target}
On Failure

A failure occurs when the transfer is incomplete and all retry attempts were unsuccessful.

Select one of the following options:

No Action - Selecting No Action causes the file to stay in the new location with the file name you specified. If another file with the same name is transferred to this location, the original file is overwritten.

Delete Source File - Selecting Delete Source File removes the file from the source location.

Move/Rename File To - Selecting Move/Rename File To requires you to specify a directory in the location where you are transferring the files to and to provide an expression used to rename the file.

Specify a full path value to rename the remote file on transmission failure or move it to a different remote folder. The limited expression language can be used to specify a file name/folder.

Note The specified Move/Rename File To should not start with the forward slash (/) symbol.

Examples:

  • ${date('yyyyddMMHHmmss')} - Use the current date as a file name.
  • ${random()} - Use a random ID as a file name.
  • archive/${date("dd.MM.yyyy")}/${ts.target} - Move the file to a dated archive folder, keeping the target file name.
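
Amazon S3 has no native rename, so a Move/Rename post-action on the remote object amounts to a copy followed by a delete. The sketch below (AWS SDK for Java v1, with the example bucket name and hypothetical keys) shows that pattern using the dated archive/ path from the last example; it is not SecureTransport's own post-action code.

import java.text.SimpleDateFormat;
import java.util.Date;
import com.amazonaws.services.s3.AmazonS3;
import com.amazonaws.services.s3.AmazonS3ClientBuilder;

public class MoveRenamePostActionSketch {
    public static void main(String[] args) {
        AmazonS3 s3 = AmazonS3ClientBuilder.defaultClient();

        String bucket = "com.amazon.axway.entity";         // example bucket name
        String sourceKey = "inbox/data.xml";                // hypothetical pulled object
        // archive/${date("dd.MM.yyyy")}/${ts.target} resolves to something like:
        String targetKey = "archive/"
                + new SimpleDateFormat("dd.MM.yyyy").format(new Date())
                + "/data.xml";

        // S3 cannot rename in place: copy to the new key, then delete the original.
        s3.copyObject(bucket, sourceKey, bucket, targetKey);
        s3.deleteObject(bucket, sourceKey);
    }
}
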
On Success

Select one of the following options:

No Action - Selecting No Action causes the file to stay in the new location with the file name you specified. If another file with the same name is transferred to this location, the original file is overwritten.

Delete Source File - Selecting Delete Source File removes the file from the source location.

Move/Rename File To - Selecting Move/Rename File To requires you to specify a directory in the location where you are transferring the files to and to provide an expression used to rename the file.

Specify a full path value to rename the remote file after a successful transmission or move it to a different remote folder. The limited expression language can be used to specify a file name/folder.

Note The specified Move/Rename File To should not start with the forward slash (/) symbol.

Examples:

  • ${date('yyyyddMMHHmmss')} - Use the current date as a file name.
  • ${random()} - Use a random ID as a file name.
  • archive/${date("dd.MM.yyyy")}/${ts.target} - Move the file to a dated archive folder, keeping the target file name.

Upload settings

Transfer Settings - Upload settings pane for an Amazon S3 transfer site:

The following table describes upload transfer settings options for an Amazon S3 transfer site.

Field Description
Transfer Settings - Upload settings
Send File As

Specify a value to send the file with a different name. A limited expression language can be used to specify a file name.

Note Do not use the Amazon S3 folder separator, the forward slash (/) symbol, in the file name.

Available environment variable:

  • ${ts.target}

Examples:

  • ${random()}_${ts.target}
  • prefix_${ts.target}_suffix
Upload destination

Specifies the Amazon S3 destination folder for the upload.

The specified destination must not start with the Amazon S3 folder forward slash (/) symbol, but it must end with one.

The specified destination path should look like this: folder1/folder2/.../folderN/

If the upload destination is not specified, the file is uploaded to the main bucket folder.

Available environment variable:

  • ${ts.target}

Examples:

  • folder/${date("dd.MM.yyyy")}/folder/
  • folder/${ts.target}/folder/
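
Taken together, Send File As and Upload destination effectively determine the final S3 object key: the destination folder (ending in /) followed by the file name (which must not contain /). The following short sketch of that assembly uses hypothetical values and plain Java string handling; it only illustrates the resulting key shape.

import java.text.SimpleDateFormat;
import java.util.Date;

public class UploadKeySketch {
    public static void main(String[] args) {
        String tsTarget = "invoice.pdf";                   // ${ts.target}: name of the file being pushed

        // Send File As: prefix_${ts.target}_suffix  (no forward slash allowed here)
        String sendFileAs = "prefix_" + tsTarget + "_suffix";

        // Upload destination: folder/${date("dd.MM.yyyy")}/folder/  (must end with /, must not start with /)
        String destination = "folder/"
                + new SimpleDateFormat("dd.MM.yyyy").format(new Date())
                + "/folder/";

        // The resulting object key, relative to the configured Bucket.
        System.out.println(destination + sendFileAs);
    }
}
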
Upload Mode

Predefined configuration that presents the Amazon S3 upload options:

  • Auto detect - Represents the recommended practice for file upload by Amazon. Use Single mode for files smaller than 100 MB and Multipart mode for files larger than 100 MB.
  • Single - Upload the file (up to 5 GB) in a single put operation.
  • Multipart - Upload the file in multiple parts, from 5 MB to 5 GB in size. If the file is smaller than 5 MB, Single mode is used.
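
A rough SDK-level picture of the Auto detect behavior, under the AWS SDK for Java (v1) with a hypothetical local file and the example bucket: small files go up in one put operation, larger ones via multipart through a TransferManager. This is only a sketch of the idea, not the transfer site's implementation.

import java.io.File;
import com.amazonaws.services.s3.AmazonS3;
import com.amazonaws.services.s3.AmazonS3ClientBuilder;
import com.amazonaws.services.s3.transfer.TransferManager;
import com.amazonaws.services.s3.transfer.TransferManagerBuilder;

public class UploadModeSketch {
    private static final long AUTO_DETECT_THRESHOLD = 100L * 1024 * 1024;   // 100 MB

    public static void main(String[] args) throws InterruptedException {
        AmazonS3 s3 = AmazonS3ClientBuilder.defaultClient();
        String bucket = "com.amazon.axway.entity";
        File file = new File("/tmp/payload.bin");           // hypothetical local file

        if (file.length() < AUTO_DETECT_THRESHOLD) {
            // Single mode: one put operation.
            s3.putObject(bucket, "uploads/" + file.getName(), file);
        } else {
            // Multipart mode: the TransferManager splits the file into parts.
            TransferManager tm = TransferManagerBuilder.standard()
                    .withS3Client(s3)
                    .build();
            tm.upload(bucket, "uploads/" + file.getName(), file).waitForCompletion();
            tm.shutdownNow(false);                           // keep the underlying client
        }
    }
}
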
System defined metadata

List of key=value pair entries for system metadata. The metadata is added as HTTP headers to the uploaded files. The system defined metadata pairs can be evaluated using the SecureTransport expression language.

Examples:

  • x-amz-server-side-encryption=AES256 - Set server-side encryption.
  • Content-Type=text/html; charset=utf-8 - Set the Content Type.
  • Cache-Control=max-age=(seconds) - Set Cache Control.
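
At the SDK level these system headers correspond to fields of the request metadata. The following is a minimal sketch with the AWS SDK for Java (v1), a hypothetical local file, and the example bucket; it shows how such headers travel with an upload, not how SecureTransport itself sets them.

import java.io.File;
import com.amazonaws.services.s3.AmazonS3;
import com.amazonaws.services.s3.AmazonS3ClientBuilder;
import com.amazonaws.services.s3.model.ObjectMetadata;
import com.amazonaws.services.s3.model.PutObjectRequest;

public class SystemMetadataSketch {
    public static void main(String[] args) {
        AmazonS3 s3 = AmazonS3ClientBuilder.defaultClient();

        ObjectMetadata metadata = new ObjectMetadata();
        metadata.setSSEAlgorithm(ObjectMetadata.AES_256_SERVER_SIDE_ENCRYPTION); // x-amz-server-side-encryption=AES256
        metadata.setContentType("text/html; charset=utf-8");                     // Content-Type
        metadata.setCacheControl("max-age=3600");                                // Cache-Control

        s3.putObject(new PutObjectRequest(
                "com.amazon.axway.entity", "uploads/page.html", new File("/tmp/page.html"))
                .withMetadata(metadata));
    }
}
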
User defined metadata

List of key=value entries for custom metadata to be attached to the uploaded file. The metadata is transformed into x-amz-meta-KEY HTTP headers. The metadata pairs can be evaluated using the SecureTransport expression language.

Examples:

  • key=${ts.target} - The user defined metadata is evaluated using the target file name when the transfer site is executed.
  • key=${date("dd.MM.yyyy")} - The user defined metadata is evaluated using the current date when the transfer site is executed.
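
User defined pairs end up as x-amz-meta-* headers on the uploaded object. A short hedged sketch (AWS SDK for Java v1, hypothetical key names and values) of how such a pair is attached to the object metadata:

import java.text.SimpleDateFormat;
import java.util.Date;
import com.amazonaws.services.s3.model.ObjectMetadata;

public class UserMetadataSketch {
    public static void main(String[] args) {
        ObjectMetadata metadata = new ObjectMetadata();

        // key=${ts.target} -> sent as the HTTP header x-amz-meta-key: invoice.pdf
        metadata.addUserMetadata("key", "invoice.pdf");

        // key2=${date("dd.MM.yyyy")} -> sent as x-amz-meta-key2: <current date>
        metadata.addUserMetadata("key2", new SimpleDateFormat("dd.MM.yyyy").format(new Date()));

        System.out.println(metadata.getUserMetadata());
    }
}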

Basic flow descriptions

The following topics outline the basic flow descriptions for Amazon S3 transfer site pushes, pulls and debug logging:

Download files

You can configure an Amazon S3 transfer site as a subscription of any SecureTransport application that supports pull operations.

To use the transfer site with a Basic application, perform the following steps:

  1. Create a Basic application. For additional information, see Create a Basic Application.
  2. Create a SecureTransport account. For additional information, see Create a user account.
  3. Create and configure an Amazon S3 transfer site for the account.
  4. Subscribe the account to the Basic application.
  5. To pull files, on the Subscription page specify: For Files Received from this Account or its Partners > Automatically Retrieve Files From: <Amazon S3 transfer site>. Next, define a pull schedule. SecureTransport will pull the file configured in the Download object key property of the transfer site instance to the account’s subscription folder based on the schedule.

If a file with the same name (as the one to be downloaded) already exists in the subscription folder at the time the pull operation is performed, the existing file is overwritten with the newly downloaded file.

Example: Download all the content of the bucket folder including subfolders

  • Download object key / folder: /
  • Download Pattern is left empty.

Example: Download all image files from the main bucket folder not including subfolders using File Globbing:

  • Download object key / folder: /main_dir/dir1/
  • Download Pattern: *.jpg

Example: Download all files from the main bucket folder not including subfolders using File Globbing:

  • Download object key / folder: /
  • Download Pattern: *.*

Example: Download image files from the main bucket folder not including subfolders with filename prefix using File Globbing:

  • Download object key / folder: /main_dir/dir1/
  • Download Pattern: Tu*.jpg

Example: Download all image files from the main bucket folder including subfolders using File Globbing:

  • Download object key / folder: /main_dir/dir1/
  • Download Pattern: **.jpg
Note The double asterisk (**) symbol is used to escape the Amazon S3 folder separator (/) symbol, which is a terminating symbol in File Globbing.

Example: Download image files from the main bucket folder including subfolders with filename prefix using File Globbing:

  • Download object key / folder: /main_dir/dir1/
  • Download Pattern: **Tu*.jpg
Note The double asterisk (**) symbol is used to escape the Amazon S3 folder separator (/) symbol, which is a terminating symbol in File Globbing.
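
One way to picture what such a recursive pull over /main_dir/dir1/ boils down to at the API level: list the keys under the prefix and keep those whose remaining path matches the ** glob. The sketch below uses the AWS SDK for Java (v1) with the example bucket; pagination of truncated listings is omitted for brevity, and it is not the transfer site's actual implementation.

import java.nio.file.FileSystems;
import java.nio.file.PathMatcher;
import java.nio.file.Paths;
import com.amazonaws.services.s3.AmazonS3;
import com.amazonaws.services.s3.AmazonS3ClientBuilder;
import com.amazonaws.services.s3.model.S3ObjectSummary;

public class RecursiveDownloadSketch {
    public static void main(String[] args) {
        AmazonS3 s3 = AmazonS3ClientBuilder.defaultClient();
        String bucket = "com.amazon.axway.entity";
        String prefix = "main_dir/dir1/";                    // Download object key/folder (without the leading /)

        // **Tu*.jpg: ** may span folder separators, Tu*.jpg constrains the file name.
        PathMatcher pattern = FileSystems.getDefault().getPathMatcher("glob:**Tu*.jpg");

        // Only the first page of results is inspected here; real code would follow continuation tokens.
        for (S3ObjectSummary summary : s3.listObjectsV2(bucket, prefix).getObjectSummaries()) {
            String relativeKey = summary.getKey().substring(prefix.length());
            if (!relativeKey.isEmpty() && pattern.matches(Paths.get(relativeKey))) {
                System.out.println("Would download: " + summary.getKey());
            }
        }
    }
}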

Upload files to Amazon S3 storage

You can configure an Amazon S3 transfer site as a subscription of any SecureTransport application that supports push operations.

To use the transfer site with a Basic application, perform the following steps:

  1. Create a Basic application. For additional information, see Create a Basic Application.
  2. Create a SecureTransport account. For additional information, see Create a user account.
  3. Create and configure an Amazon S3 transfer site for the account.
  4. Specify the Upload Destination. If the Upload Destination is not specified, the file will be uploaded to the main bucket folder.
  5. Remember that the forward slash (/) is automatically appended from the Amazon S3 bucket path. If the file needs to be uploaded to a specific folder, the folder path should look like this: folder1/folder2/…/folderN/
  6. Subscribe the account to the Basic application.
  7. To push files, on the Subscription page specify: For Files Sent to this Account or its Partners > Send Files Directly To: <Amazon S3 transfer site>. Whenever a file is received in the account’s subscription folder, SecureTransport pushes it to the specified Amazon S3 bucket using the configuration in the transfer site.

Debug logging

To enable Amazon S3 transfer site debug logging, edit the <FILEDRIVEHOME>/conf/tm-log4j.xml file and add the following two loggers:

<!-- Plugins logger -->
<logger name="com.axway.st.plugins" additivity="false">
    <level value="debug" />
    <appender-ref ref="ServerLog" />
</logger>

<!-- AWS SDK Logger -->
<logger name="com.amazonaws" additivity="false">
    <level value="debug" />
    <appender-ref ref="ServerLog" />
</logger>

Supported expression language

This section outlines the Amazon S3 transfer site limited expression language.

Predefined variables

The predefined variable that is supported:

  • ${timestamp}

Predefined functions

The predefined functions that are supported:

  • Functions related to a date. For example: ${date("yyyyMMdd")}
  • Functions related to a Random ID. For example: ${random()}
  • Functions related to a String representation. For example: ${concat('str', 'ing')}
Note Expression variables and functions related to file name and the SecureTransport environment are not supported.
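
Purely to illustrate the kind of values these functions yield, the sketch below produces comparable output in plain Java. It is not the expression engine, and the exact output of ${random()} is not documented here; a UUID is assumed for the illustration.

import java.text.SimpleDateFormat;
import java.util.Date;
import java.util.UUID;

public class ExpressionFunctionsSketch {
    public static void main(String[] args) {
        // ${date("yyyyMMdd")} - the current date in the given pattern, e.g. 20150130
        System.out.println(new SimpleDateFormat("yyyyMMdd").format(new Date()));

        // ${random()} - a random identifier (a UUID is assumed here for illustration only)
        System.out.println(UUID.randomUUID());

        // ${concat('str', 'ing')} - plain string concatenation: "string"
        System.out.println("str" + "ing");
    }
}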

Added expression variables

On upload and download, the following variable is added:

  • ${ts.target} – The file name of the file that will be downloaded or uploaded. The variable is available on download, list, and upload operations.

Known issues and limitations

The following table lists the known Amazon S3 transfer issues and limitations.

Area Description
Proxy Support

Configure an external HTTP Proxy on SecureTransport Edges if S3 transfers will traverse the DMZ through the SecureTransport Edge network zone.
