
C#: Migrating files from MongoDB to Azure Blob Storage with the "UploadFromStream" method

倚天杖 2022-11-21 20:17:11
I am running into a problem migrating files from MongoDB to Azure Blob Storage. The method below takes a GridFSFile object (representing a file in the MongoDB GridFSFileStorage) and then calls uploadMemoryStream to do the upload. Worth mentioning: after FindById the gridFSFile does have content, Length also reports a value, and Position is initially 0. gridFSFile.Open creates a Stream object, which I then pass as a parameter to the upload.

private static void iterateOverVersionCollection(Version version, Asset asset)
{
    try
    {
        string _gridFSId = version.GridFSId;
        GridFSFile gridFSFile = gridFSFileStorage.FindById(_gridFSId);
        if (gridFSFile == null) return;

        string size = version.Name.ToLower();
        asset.Size = size;

        CloudBlockBlob blockBlob = GetBlockBlobReference(version, gridFSFile, asset);
        uploadMemoryStream(blockBlob, gridFSFile, asset);
        asset.UploadedOK = true;
    }
    catch (StorageException ex)
    {
        asset.UploadedOK = false;
        logException(ex, asset);
    }
}

private static void uploadMemoryStream(CloudBlockBlob blockBlob, GridFSFile gridFSFile, Asset asset)
{
    Stream st = gridFSFile.Open();
    blockBlob.UploadFromStream(st);
}

UploadFromStream takes forever and never uploads. One thing to mention: no matter what I do with gridFSFile, if I try to create a MemoryStream from it with the C# Stream.CopyTo method, that also runs forever and never finishes, so the application is stuck at blockBlob.UploadFromStream(st);. Besides passing gridFSFile.Open straight to UploadFromStream, I also tried the following snippet:

using (var stream = new MemoryStream())
{
    byte[] buffer = new byte[2048]; // read in chunks of 2KB
    int bytesRead;
    while ((bytesRead = st.Read(buffer, 0, buffer.Length)) > 0)
    {
        stream.Write(buffer, 0, bytesRead);
    }
    byte[] result = stream.ToArray();
}

But again, the program hangs on the st.Read line. Any help would be much appreciated.

1 Answer

拉丁的传说


Please note that since UploadFromFileAsync() or UploadFromStream is neither a reliable nor an efficient operation for huge blobs, I would suggest you consider the following alternatives:


If a command-line tool is acceptable, you can try AzCopy, which transfers Azure Storage data at high performance, and whose transfers can be paused and resumed.
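For example (a sketch using AzCopy v10 syntax; the storage account name, container, and SAS token below are placeholders you would substitute):

azcopy copy "C:\Tom\TestLargeFile.zip" "https://<myaccount>.blob.core.windows.net/mycontainer/LargeFile.zip?<SAS-token>"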


If you want to control the transfer jobs programmatically, use the Azure Storage Data Movement Library, which is the core of AzCopy. Sample code for the same:


string storageConnectionString = "myStorageConnectionString";
CloudStorageAccount account = CloudStorageAccount.Parse(storageConnectionString);
CloudBlobClient blobClient = account.CreateCloudBlobClient();
CloudBlobContainer blobContainer = blobClient.GetContainerReference("mycontainer");
blobContainer.CreateIfNotExistsAsync().Wait();

string sourcePath = @"C:\Tom\TestLargeFile.zip";
CloudBlockBlob destBlob = blobContainer.GetBlockBlobReference("LargeFile.zip");

// Set up the number of concurrent operations
TransferManager.Configurations.ParallelOperations = 64;

// Set up the transfer context and track the upload progress
var context = new SingleTransferContext
{
    ProgressHandler =
        new Progress<TransferStatus>(
            progress => { Console.WriteLine("Bytes uploaded: {0}", progress.BytesTransferred); })
};

// Upload a local blob
TransferManager.UploadAsync(sourcePath, destBlob, null, context, CancellationToken.None).Wait();
Console.WriteLine("Upload finished!");
Console.ReadKey();
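Since your source is a GridFS stream rather than a local file, note that the Data Movement Library also exposes a stream-based overload of TransferManager.UploadAsync. A minimal sketch, assuming the Microsoft.Azure.Storage.DataMovement package and reusing destBlob and context from above (gridFSFile is the object from your FindById call):

using (Stream st = gridFSFile.Open())
{
    // Stream-based overload; the library chunks the stream into blocks
    // internally. If the source stream is not seekable, you may need to
    // copy it to a temporary file or MemoryStream first.
    TransferManager.UploadAsync(st, destBlob, null, context, CancellationToken.None).Wait();
}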

If you are still looking to upload a file from a stream programmatically, I would suggest uploading it in chunks using the code below:


var container = _client.GetContainerReference("test");
container.CreateIfNotExists();
var blob = container.GetBlockBlobReference(file.FileName);
var blockDataList = new Dictionary<string, byte[]>();

using (var stream = file.InputStream)
{
    var blockSizeInKB = 1024;
    var offset = 0;
    var index = 0;
    while (offset < stream.Length)
    {
        var readLength = Math.Min(1024 * blockSizeInKB, (int)stream.Length - offset);
        var blockData = new byte[readLength];

        // Read may return fewer bytes than requested, so trim the block
        // to the number of bytes actually read
        var bytesRead = stream.Read(blockData, 0, readLength);
        if (bytesRead < readLength)
            Array.Resize(ref blockData, bytesRead);
        offset += bytesRead;

        // Block IDs must be Base64 strings of equal length; encoding a
        // fixed-size 4-byte index satisfies that requirement
        blockDataList.Add(Convert.ToBase64String(BitConverter.GetBytes(index)), blockData);
        index++;
    }
}

// Upload the blocks in parallel, then commit them in their original order
Parallel.ForEach(blockDataList, (bi) =>
{
    blob.PutBlock(bi.Key, new MemoryStream(bi.Value), null);
});
blob.PutBlockList(blockDataList.Select(b => b.Key).ToArray());
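If the source stream does not report a reliable Length (which may be the case for the GridFS stream in your question), a variant that simply reads until the stream is exhausted avoids relying on Length altogether. A sketch, assuming sourceStream and blob are already set up as above:

var blockIds = new List<string>();
var buffer = new byte[4 * 1024 * 1024]; // 4 MB per block
int index = 0;
int bytesRead;
while ((bytesRead = sourceStream.Read(buffer, 0, buffer.Length)) > 0)
{
    string blockId = Convert.ToBase64String(BitConverter.GetBytes(index++));
    // Upload only the bytes actually read in this iteration
    blob.PutBlock(blockId, new MemoryStream(buffer, 0, bytesRead), null);
    blockIds.Add(blockId);
}
blob.PutBlockList(blockIds); // commit the blocks in the order they were read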

On the other hand, if you have the file available on disk and want to use the UploadFromFile method, that method is also flexible enough to upload the file data in chunks:


TimeSpan backOffPeriod = TimeSpan.FromSeconds(2);
int retryCount = 1;
BlobRequestOptions bro = new BlobRequestOptions()
{
    SingleBlobUploadThresholdInBytes = 1024 * 1024, // 1 MB, the minimum
    ParallelOperationThreadCount = 1,
    RetryPolicy = new ExponentialRetry(backOffPeriod, retryCount),
};

CloudStorageAccount cloudStorageAccount = CloudStorageAccount.Parse(ConnectionString);
CloudBlobClient cloudBlobClient = cloudStorageAccount.CreateCloudBlobClient();
cloudBlobClient.DefaultRequestOptions = bro;
CloudBlobContainer cloudBlobContainer = cloudBlobClient.GetContainerReference(ContainerName);
CloudBlockBlob blob = cloudBlobContainer.GetBlockBlobReference(Path.GetFileName(fileName));
blob.StreamWriteSizeInBytes = 256 * 1024; // 256 KB per block
blob.UploadFromFile(fileName, FileMode.Open);
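The same request options also govern UploadFromStream, which matches the scenario in your question. A minimal sketch, assuming blob is configured as above and gridFSFile is your object from FindById:

using (Stream st = gridFSFile.Open())
{
    // StreamWriteSizeInBytes and DefaultRequestOptions set above apply to
    // this call too, so the stream is sent as a sequence of 256 KB blocks
    // with the configured retry policy.
    blob.UploadFromStream(st);
}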

For a detailed explanation, please see:


https://www.red-gate.com/simple-talk/cloud/platform-as-a-service/azure-blob-storage-part-4-uploading-large-blobs/


Hope this helps.

