I have an Azure Function that gets triggered when a blob is uploaded to a certain directory. It processes a zip file. When I upload the file to the container with Azure Storage Explorer, it works perfectly. When I upload it through my web API, it blows up. If I download that file it seems to be corrupt; however, it is the same size as the source, and when I do a Beyond Compare between the two, they seem identical (and Beyond Compare can even see the contents of the zip file).
Here is the upload code:
public async Task<UploadedFileDescription> StoreCatalog(HttpRequestMessage Request)
{
    UploadedFileDescription upload = new UploadedFileDescription(); // my return description
    var guidString = Guid.NewGuid().ToString();
    string fileName = guidString + ".zip"; // add the .zip extension
    upload.fileName = fileName;
    string storageConnection = CloudConfigurationManager.GetSetting("StorageConnectionString");
    CloudStorageAccount cloudStorageAccount = CloudStorageAccount.Parse(storageConnection);
    CloudBlobClient cloudBlobClient = cloudStorageAccount.CreateCloudBlobClient();
    CloudBlobContainer cloudBlobContainer = cloudBlobClient.GetContainerReference("uploadedzips");
    CloudBlockBlob cloudBlockBlob = cloudBlobContainer.GetBlockBlobReference(fileName);
    cloudBlockBlob.Properties.ContentType = "application/x-zip-compressed";
    var stream = await Request.Content.ReadAsStreamAsync();
    cloudBlockBlob.UploadFromStream(stream);
    return upload;
}
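Not part of the original question, but worth double-checking in the upload path: the request stream may not be positioned at the start (or may not report a length) when it is handed to the SDK. A hedged sketch of the same step, assuming the `Request` and `cloudBlockBlob` variables from the method above, that buffers the body and rewinds before uploading:

```csharp
// Hypothetical variation of the upload step: copy the request body into a
// MemoryStream, rewind it, and upload asynchronously so the blob client
// reads the stream from position 0 with a known length.
var stream = await Request.Content.ReadAsStreamAsync();
using (var buffer = new MemoryStream())
{
    await stream.CopyToAsync(buffer);
    buffer.Position = 0; // rewind before handing the stream to the SDK
    await cloudBlockBlob.UploadFromStreamAsync(buffer);
}
```

This also replaces the synchronous `UploadFromStream` call with its async counterpart, which is the idiomatic choice inside an `async` method.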
The file gets into blob storage fine.
Then the triggered Azure Function fires and blows up. Here is the relevant code:
ZipArchive archive = new ZipArchive(myBlob);
CloudStorageAccount storageAccount = CloudStorageAccount.Parse(destinationStorage);
CloudBlobClient blobClient = storageAccount.CreateCloudBlobClient();
CloudBlobContainer container = blobClient.GetContainerReference(destinationContainer);
var catalogService = Helpers.container.Resolve<ICatalogService>();
foreach (ZipArchiveEntry file in archive.Entries) // <-- blows up here with the message below
System.IO.InvalidDataException
HResult=0x80131501
Message=Number of entries expected in End Of Central Directory does not correspond to number of entries in Central Directory.
Source=System.IO.Compression
StackTrace:
at System.IO.Compression.ZipArchive.ReadCentralDirectory()
at System.IO.Compression.ZipArchive.get_Entries()
at DentaCAD.AzureFunctions.ProcessUploadedZip.Run(Stream myBlob, String name, TraceWriter log) in C:\repo\ProcessUploadedZip.cs:line 35
at Microsoft.Azure.WebJobs.Host.Executors.VoidMethodInvoker`2.InvokeAsync(TReflected instance, Object[] arguments)
at Microsoft.Azure.WebJobs.Host.Executors.FunctionInvoker`2.<InvokeAsync>d__9.MoveNext()
When that line gets hit after uploading the file manually, there are 163 entries in the archive (all small JPG files).
Any thoughts?
It could be a matter of the wrong MIME type. Try changing the content type to "application/zip" instead of "application/x-zip-compressed".
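Concretely, that would be a one-line change in the upload method shown in the question, just before the stream is read:

```csharp
// Suggested change: use the standard MIME type for zip archives
// instead of the legacy "application/x-zip-compressed" value.
cloudBlockBlob.Properties.ContentType = "application/zip";
```

Note that the blob's content type is metadata only; it does not alter the uploaded bytes, so if the downloaded file is genuinely byte-identical to the source, the corruption may lie elsewhere in the upload path.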
User contributions licensed under CC BY-SA 3.0