AWS: Simple Storage Service (S3) Helper Class - TransferUtility - C#

In my previous post, we learnt about Amazon S3 and how to upload / download / copy / delete the files (or objects) of an S3 bucket using C#. The methods given in that post use single GET / PUT operations, so they can upload only up to 5 GB per object (Refer here); beyond that we need to use the Multipart Upload API. For implementing the Multipart Upload API in .NET we use the TransferUtility class.

TransferUtility

TransferUtility is a high-level utility for managing transfers to and from Amazon S3. It provides a simple API for uploading and downloading content to/from Amazon S3, and it uses the Amazon S3 Multipart Upload API under the hood, so you can upload large objects, up to 5 TB.

It uses multiple threads to upload multiple parts of a single file at once, which increases throughput when dealing with large content sizes and high bandwidth.
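To get a feel for the API, here is a minimal sketch (the bucket name and file path are placeholders, and the client is assumed to pick up credentials from your profile / environment). TransferUtility switches to a multipart upload automatically for files above the configured part-size threshold:

using Amazon.S3;
using Amazon.S3.Transfer;

var client = new AmazonS3Client(); // credentials resolved from the default profile / environment
var transferUtility = new TransferUtility(client);

// One call; a multipart upload is used automatically for large files
await transferUtility.UploadAsync(@"C:\temp\bigfile.zip", "my-bucket");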

Configure the TransferUtility

There are three optional properties that you can configure (a short configuration sketch follows the descriptions):

ConcurrentServiceRequests 

Determines how many concurrent asynchronous web requests (and hence active threads) are used to upload / download the file. The default value is 10.

MinSizeBeforePartUpload

Gets or sets the minimum part size for upload parts in bytes. The default is 16 MB. Decreasing the minimum part size causes multipart uploads to be split into a larger number of smaller parts. Setting this value too low has a negative effect on transfer speeds, causing extra latency and network communication for each part.

NumberOfUploadThreads 

Gets or sets the number of executing threads, i.e. how many active threads are used to upload the file. The default value is 10 threads.
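Putting these together, a configuration sketch might look like the following (the values shown are just the documented defaults, written out for illustration, and 'client' is an AmazonS3Client you have already created; NumberOfUploadThreads can be set on the same config object in the same way). The helper class below wraps exactly this pattern:

using Amazon.S3.Transfer;

var config = new TransferUtilityConfig
{
 ConcurrentServiceRequests = 10,             // concurrent part uploads / downloads
 MinSizeBeforePartUpload = 16 * 1024 * 1024  // 16 MB; smaller objects use a single PUT
};
var transferUtility = new TransferUtility(client, config); // 'client' is an existing AmazonS3Client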

Following is the C#.NET helper class for uploading / downloading files using TransferUtility:
using System;
using System.IO;
using System.Text;
using System.Threading.Tasks;
using Amazon.Runtime;
using Amazon.S3;
using Amazon.S3.Transfer;
using log4net; // assuming log4net for ILog / LogManager

public class AmazonS3TransferHelper
{
 AmazonS3Client client;

 private static readonly ILog _logger = LogManager.GetLogger(typeof(AmazonS3TransferHelper));
 private readonly string accessKeyId, secretKey, serviceUrl;
 public AmazonS3TransferHelper(string accessKeyId, string secretKey, string serviceUrl)
 {
  this.accessKeyId = accessKeyId;
  this.secretKey = secretKey;
  this.serviceUrl = serviceUrl;
  client = GetClient();
 }

 /// <summary>
 /// Initializes the AmazonS3Client on first use and returns it
 /// </summary>
 /// <returns>The configured AmazonS3Client instance</returns>
 private AmazonS3Client GetClient()
 {
  if (client == null)
  {
   try
   {
    // S3 config object
    AmazonS3Config clientConfig = new AmazonS3Config
    {
     // Set the endpoint URL
     ServiceURL = serviceUrl
    };
    client = new AmazonS3Client(accessKeyId, secretKey, clientConfig);
   }
    catch (AmazonS3Exception ex)
    { _logger.Error("Error (AmazonS3Exception) creating S3 client", ex); }
    catch (AmazonServiceException ex)
    { _logger.Error("Error (AmazonServiceException) creating S3 client", ex); }
    catch (Exception ex)
    { _logger.Error("Error creating AWS S3 client", ex); }
  }
  return client;
 }

 private TransferUtility GetTransferUtility()
 {
  var config = new TransferUtilityConfig()
  {
   ConcurrentServiceRequests = 10,
   MinSizeBeforePartUpload = 16 * 1024 * 1024
  };

  return new TransferUtility(GetClient(), config);
 }

 /// <summary>
 /// Uploads the file to the S3 bucket. 
 /// </summary>
 /// <param name="bucketNameWithPath">S3 bucket name along with the subfolders. Ex. If you are using a 'dev' folder under bucket 'myBucket', this value should be myBucket/dev</param>
 /// <param name="fileNameInS3">File name used to store the content in the bucket</param>
 /// <param name="fileContent">String content which needs to be stored in the file</param>
 /// <returns></returns>
 public async Task Upload(string bucketNameWithPath, string fileNameInS3, string fileContent)
 {
  _logger.Info("Entering AmazonS3TransferHelper.Upload");
  try
  {
   byte[] byteArray = Encoding.UTF8.GetBytes(fileContent); // UTF8 rather than ASCII, so non-ASCII content is preserved
   using (MemoryStream stream = new MemoryStream(byteArray))
   {
    var transferUtility = GetTransferUtility();
    var transferUploadRequest = new TransferUtilityUploadRequest
    {
     BucketName = bucketNameWithPath,
     Key = fileNameInS3,
     InputStream = stream
    };

    await transferUtility.UploadAsync(transferUploadRequest); // commencing the transfer
   }
  }
  catch (Exception ex)
  {
   _logger.Error("Error in uploading file to s3 bucket", ex);
  }

  _logger.Info("Leaving AmazonS3TransferHelper.Upload");
 }

 /// <summary>
 /// Downloads the file from S3 bucket
 /// </summary>
 /// <param name="bucketNameWithPath">S3 bucket name along with the subfolders. Ex. If you are using a 'dev' folder under bucket 'myBucket', this value should be myBucket/dev</param>
 /// <param name="fileNameInS3">File which needs to be get from the bucket</param>
 /// <param name="localFilePath">Local folder path to store the file downloaded from S3</param>
 /// <returns></returns>
 public async Task Download(string bucketNameWithPath, string fileNameInS3, string localFilePath)
 {
  _logger.Info("Entering AmazonS3TransferHelper.Download");
  try
  {
   var transferUtility = GetTransferUtility();
   var transferDownloadRequest = new TransferUtilityDownloadRequest
   {
    BucketName = bucketNameWithPath,
    Key = fileNameInS3,
    FilePath = Path.Combine(localFilePath, fileNameInS3) // local path: <localFilePath>/<fileNameInS3>
   };

   await transferUtility.DownloadAsync(transferDownloadRequest);
  }
  catch (Exception ex)
  {
   _logger.Error("Error in downloading file to s3 bucket", ex);
  }

  _logger.Info("Leaving AmazonS3TransferHelper.Download");
 }
}

Usage:

Create the helper class object:
AmazonS3TransferHelper transferHelper = new AmazonS3TransferHelper(<accesskey>, <secret key>, <AWS S3 endpoint url>);
Upload:
Here, I am uploading the S3.txt file to the dev folder in my S3 bucket named gopiBucket, storing it as "S3Sample_1.txt".
string fileContent = File.ReadAllText("S3.txt");
await transferHelper.Upload("gopiBucket/dev", "S3Sample_1.txt", fileContent);
Download a file:
Downloading the file from the dev folder to the "Downloads" folder in my solution. If the "Downloads" folder does not exist, it will be created.
await transferHelper.Download("gopiBucket/dev", "S3Sample_1.txt", "Downloads");
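As a quick sanity check (using the same file and folder names as above), you can read the downloaded file back and compare it with what was uploaded:

string roundTrip = File.ReadAllText(Path.Combine("Downloads", "S3Sample_1.txt"));
Console.WriteLine(roundTrip == fileContent ? "Round trip OK" : "Content mismatch");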
Happy Coding  😊!!

Gopikrishna
